AI computing chips are the "engine of the AI era". The ChatGPT boom has prompted global technology companies to accelerate deployment of large AI models: Google, Meta, Baidu, Alibaba, Huawei, DeepSeek, and others have launched large-model products and continue to iterate on them. The four major North American cloud vendors, whose core businesses have benefited from AI, continue to raise capital expenditure; the three major domestic Internet vendors are likewise increasing capex, and construction of domestic intelligent computing centers is accelerating, driving rapid growth in demand for computing power. IDC forecasts that global computing power will grow from 1,397 EFLOPS in 2023 to 16 ZFLOPS in 2030, a compound annual growth rate of 50% over 2023-2030. AI servers are the core infrastructure supporting generative AI applications, and AI computing chips provide the underlying support for those servers, making them the cornerstone of computing power. As the "engine of the AI era", AI computing chips are expected to ride the explosive wave of demand for AI computing power and to drive the rapid development and wide application of AI technology.
AI computing power chips are dominated by GPUs, while the market for customized ASIC chips is growing rapidly. By application scenario, AI computing chips can be divided into cloud, edge, and terminal chips; by design approach and application, they can be divided into general-purpose and special-purpose AI chips. Today's AI computing market is dominated by GPUs. According to Statista, the global GPU market was $43.6 billion in 2023 and is expected to reach $274.2 billion in 2029, a compound annual growth rate of 33.2% over 2024-2029. According to TechInsights, Nvidia dominated the field with a 98% share of data center GPU shipments in 2023. The GPU software ecosystem is complex, slow, and difficult to build, and it has established extremely high industry barriers. An AI ASIC is a custom integrated circuit designed for artificial intelligence applications, offering high performance, low power consumption, customizability, and low cost. Because of NVIDIA's near-monopoly on the data center GPU market, and for reasons of cost, differentiated competition, innovation, and supply chain diversification, cloud providers and other large players are clearly trending toward self-developed chips. This is driving rapid growth in the data center custom ASIC market, which is expected to outpace general-purpose AI computing chips. According to Marvell, the custom data center ASIC market was approximately $6.6 billion in 2023 and is expected to reach $42.9 billion in 2028, a compound annual growth rate of 45% over 2023-2028. In recent years, the United States has steadily tightened export controls on high-end GPUs, ushering in a golden period of development for domestic AI computing chip manufacturers.
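The compound annual growth rates cited above can be sanity-checked with a short calculation. The sketch below applies the standard CAGR formula to the Marvell figures for custom data center ASICs ($6.6 billion in 2023 to $42.9 billion in 2028); the function name is illustrative, not from any source cited here:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Marvell: custom data center ASIC market, $6.6B (2023) -> $42.9B (2028)
rate = cagr(6.6, 42.9, 5)
print(f"{rate:.1%}")  # roughly 45%, consistent with the CAGR cited in the text
```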
DeepSeek is expected to accelerate the development of domestic AI computing chips. DeepSeek achieves cost-effective large-model training and inference through technological innovation, chiefly its Mixture-of-Experts (MoE) architecture, Multi-head Latent Attention (MLA), FP8 mixed-precision training, multi-token prediction (MTP), and distillation. DeepSeek-V3 benchmarks against GPT-4o, and DeepSeek-R1 benchmarks against OpenAI o1. According to data published by DeepSeek on January 20, 2025, DeepSeek-R1 API calls cost less than 5% of OpenAI o1's. DeepSeek-R1 delivers highly cost-effective model inference, and distillation gives small models strong reasoning capability at low cost, which will help AI applications deploy at scale and is expected to accelerate the release of inference demand. IDC predicts that by 2028, inference will account for 73% of China's AI server workloads; since inference servers will far outnumber training servers, the scope for domestic substitution of inference-oriented AI computing chips is that much broader. The domestic computing power ecosystem has fully adapted to DeepSeek, whose innovations improve the efficiency of AI computing chips and thereby accelerate the push for independent, controllable domestic AI chips. Domestic AI computing chip manufacturers are expected to accelerate their development and continue to gain market share.
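The Mixture-of-Experts idea mentioned above, activating only a small subset of expert networks per token to cut compute cost, can be illustrated with a minimal top-k routing sketch. This is a simplified toy, not DeepSeek's actual implementation; the dimensions, gating scheme, and k are illustrative assumptions:

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route input x to the top-k experts by gate score and mix their outputs.

    x:       (d,) input vector
    gate_w:  (n_experts, d) gating weights
    experts: list of (d, d) matrices, one linear 'expert' each
    Only k of n_experts run per token, which is how MoE saves compute.
    """
    logits = gate_w @ x                       # one score per expert
    top = np.argsort(logits)[-k:]             # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                  # softmax over the selected experts only
    # Weighted sum of the selected experts' outputs; unselected experts never run
    return sum(w * (experts[i] @ x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
x = rng.standard_normal(d)
gate_w = rng.standard_normal((n_experts, d))
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
y = moe_forward(x, gate_w, experts, k=2)
print(y.shape)  # (8,)
```

With k=2 of 4 experts active, the per-token matrix-multiply cost is half that of running every expert, which is the efficiency argument behind MoE-based models such as DeepSeek-V3.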
Investment advice. We suggest paying attention to Cambricon-U (688256) and Haiguang Information (688041) for cloud AI computing chips, VeriSilicon (688521) and Aojie Technology-U (688220) for customized ASIC chips, SMIC (688981) for advanced manufacturing, and JCET (600584) for advanced packaging.
Risk warning: intensifying international geopolitical conflict; downstream demand falling short of expectations; intensifying market competition; new-product R&D progress falling short of expectations; and domestic substitution falling short of expectations.