Ping An Securities released a research report saying that the emergence of Kimi's lossless long-text model has solved many pain points in the application of large models and opened up new application space for them. For AIGC to ultimately land, suitable scenarios must be found; as a 100-billion-parameter model, Kimi can support complex operations while also accepting and processing very long texts, solving many practical application problems of large models, so its commercialization potential will become increasingly prominent. At present, the Kimi intelligent assistant has been launched on multiple platforms, including the Apple iOS app, Android app, mini-programs, and the web. The firm remains optimistic about the AIGC industry chain, especially the application potential of large models.
Event: Recently, Moonshot AI, a domestic artificial intelligence company, announced on its WeChat official account "Moonshot AI" that its Kimi intelligent assistant has made a breakthrough in long context window technology, achieving a lossless context length of 2 million words. Previously, in October 2023, the assistant supported a lossless context length of 200,000 words; the latest capability is an order of magnitude higher.
Ping An Securities views are as follows:
Kimi's intelligent assistant has improved significantly in a short period, and its popularity in China has risen rapidly.
The lossless context length released this time is 2 million words, only about five months after the previous 200,000-word milestone. Moonshot AI itself was founded in April 2023, less than a year ago. The Kimi intelligent assistant, also known as Kimi Chat, is a conversational AI assistant built by Moonshot AI on its self-developed 100-billion-parameter large model and was officially launched in November 2023. The product's strongest capability is long context processing, including long-text summarization and generation, online search, data processing, code writing, user interaction, and translation. After launch, the tool's popularity rose quickly. According to Similarweb data, Kimi's website traffic has increased significantly in recent weeks: over the last four weeks (2.20-2.26, 2.27-3.4, 3.5-3.11, 3.12-3.18), visits were 1.003 million, 1.128 million, 1.52 million, and 2.25 million respectively. Although the company has kept expanding its servers, the pressure of rapid user growth has begun to show.
Long text promises to open up a new world of large model applications.
The parameter count of a large model determines how complex the "computations" it can support are, while how much text it can take as input (i.e., long text technology) determines how much "memory" the model has; together, the two determine how well the model performs in applications. At present, the generally short input length of large models greatly restricts their practical deployment: virtual characters "forget" important information, agents may fail to run because they cannot obtain complete input, and some game products are forced to simplify their plots because long text cannot be handled. Kimi's support for longer context means large models have more "memory," allowing their applications to go deeper and wider. The company's official account shows that Kimi can conduct market analysis across multiple financial reports, handle ultra-long legal contracts, quickly extract the key information from multiple articles or web pages, play roles based on novel settings, and so on. The order-of-magnitude increase in lossless context length will help users open their imagination to future AI application scenarios, including analysis and understanding of a complete code base, agents that autonomously complete multi-step complex tasks, lifelong assistants that never forget key information, and multi-modal models with a truly unified architecture.
"Lossless compression" and text length promotion are the requirements that long text technology needs to take into account.
The company's founder said that to realize general artificial intelligence, lossless long context will be a key foundational technology; historically, every evolution of model architecture has essentially aimed to raise the effective, lossless context length. In extending context length, both length and the degree of lossless compression must be considered for the scaling to be meaningful. Judging from the short interval of this upgrade, the company did not take a gradual, iterative route, and the technical difficulties it faced were correspondingly greater. The company's R&D and technical teams redesigned and re-developed everything from model pre-training to alignment and inference, achieving a lossless long-range attention mechanism at the hundred-billion-parameter scale. They did not rely on "shortcut" schemes that significantly damage performance, such as sliding windows, downsampling, or small models, and balanced the two metrics of length and "losslessness."
Relevant stocks:
1) In terms of computing power, it is recommended to pay attention to Inspur Information (000977.SZ), Dawning Information (603019.SH), Unisplendour (000938.SZ), etc.; it is also recommended to pay attention to Foxconn Industrial Internet (601138.SH), Cambricon (688256.SH), Jingjia Micro (300474.SZ), Gaoxin Development (000628.SZ), etc.;
2) In terms of algorithms, iFLYTEK (002230.SZ) is recommended;
3) In terms of application scenarios, ThunderSoft (300496.SZ), Hundsun Technologies (600570.SH), Shengshi Technology (002990.SZ), etc. are strongly recommended;
4) In terms of network security, Venustech (002439.SZ) is highly recommended.
Risk tips: 1) the risk that domestic computing power grows more slowly than expected; 2) compliance and regulatory risks such as copyright; 3) risks in technological evolution.