China’s Daily Token Usage Jumps 40% in Three Months

(Yicai) March 24 -- China’s use of artificial intelligence is increasing apace, with new data showing a surge over the past three months in the daily usage of tokens, the smallest units of data processed by large language models.
Average daily token usage in China has exceeded 140 trillion so far this month, over 1,000 times more than the 100 billion at the start of 2024, and up 40 percent from 100 trillion at the end of last year, Liu Liehong, director of the National Data Administration, told reporters in Beijing today.
Tokens can be measured, priced, and traded, Liu noted, and a new value system is quickly taking shape around their usage, distribution, and settlement, offering an important path toward commercializing the AI industry.
The sharp uptick in daily token usage shows that China’s AI development has entered a period of rapid growth and that the sector’s competitiveness has strengthened greatly, Liu said. It also signifies that the supply of datasets -- the structured collections of information used to train LLMs -- has swelled, that the value of data elements continues to be released, and that AI innovation and development powered by data have entered a virtuous cycle, he added.
This year’s Government Work Report, delivered to the legislature earlier this month, stated that efforts will be made to create a new form of intelligent economy and to further expand the “AI +” initiative, which aims to increase the use of AI in all sectors. It also called for further development and utilization of data resources and high-quality datasets.
2026 marks the start of China’s 15th Five-Year Plan and has also been designated as the “Year of Data Element Value Release” by the National Data Administration. By the end of last year, more than 100,000 high-quality datasets had been established nationwide, with a total volume exceeding 890 petabytes, equivalent to about 310 times the digital resources of the National Library of China.
Liu said the next step is to keep advancing data-enabled innovation in AI development and to work with all parties to implement a new action plan for high-quality dataset construction. This will involve strengthening the foundation and expanding capacity, tackling annotation challenges, improving quality and efficiency, empowering applications, managing services, and releasing value.
The aim is to create high-quality, AI-ready datasets that are technically feasible, practical, and quality-assured, Liu said.
At the same time, efforts will be made to accelerate the establishment of a unified national data property registration system, issue policies for building a national integrated data market, coordinate the construction of basic data systems and data infrastructure, and provide strong guarantees for releasing the value of data elements, he noted.
Editor: Tom Litting