Chinese Researchers Create Brain-Inspired AI Model Rivaling ChatGPT
Dou Shicong
DATE:  15 hours ago
/ SOURCE:  Yicai

(Yicai) Sept. 11 -- Chinese researchers have developed a brain-inspired large language model that promises more energy-efficient reasoning than mainstream systems such as OpenAI’s ChatGPT and Google’s BERT.

The model, called SpikingBrain, was created by a team led by Li Guoqi and Xu Bo at the Institute of Automation under the Chinese Academy of Sciences, according to a recently published study. It was trained on hundreds of graphics processing units supplied by Shanghai-based chipmaker MetaX.

SpikingBrain represents a non-transformer path for AI development, Xu Bo, director of the institute, told Xinhua News Agency. “It might inspire the design of next-generation neuromorphic chips with lower power consumption.” Neuromorphic chips are processors modeled on the brain’s architecture.

Unlike transformer-based models such as ChatGPT, which require heavy computational resources, SpikingBrain relies on event-driven spiking neurons that mimic the brain’s adaptive, energy-efficient signaling. This approach enables training with significantly smaller datasets.
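The spiking neurons described above can be illustrated with a toy leaky integrate-and-fire model, the classic unit in neuromorphic computing: it accumulates input over time and emits a binary spike only when a threshold is crossed. This is a minimal sketch for illustration, not SpikingBrain's actual implementation; the parameter values are hypothetical.

```python
def lif_neuron(inputs, threshold=1.0, decay=0.9):
    """Toy leaky integrate-and-fire neuron.

    Returns a list of 0/1 spikes, one per input step. The neuron is
    event-driven: it produces output (a spike) only when its membrane
    potential crosses the threshold, and stays silent otherwise.
    """
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = potential * decay + x  # leaky integration of input
        if potential >= threshold:
            spikes.append(1)   # fire an event...
            potential = 0.0    # ...and reset the membrane potential
        else:
            spikes.append(0)   # silent step: no downstream work triggered
    return spikes

print(lif_neuron([0.3, 0.3, 0.6, 0.1, 0.9, 0.2]))  # → [0, 0, 1, 0, 0, 1]
```

Because most steps produce no spike, downstream layers only react to the rare firing events, which is the source of the energy savings the article describes.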

Using only about 2 percent of the pre-training data consumed by mainstream large models, SpikingBrain matched the performance of several open-source models on language understanding and reasoning benchmarks, Xinhua reported.

The LLM's efficiency stems from spike-based thresholds rather than dense attention mechanisms, allowing fast and sparse computation. For example, when processing a one-million-token input, one version of SpikingBrain generated the first output token nearly 27 times faster than a comparable transformer model.
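The sparsity advantage described here can be sketched in a few lines: a dense layer touches every weight for every input, while an event-driven layer only accumulates the columns whose input neuron actually fired. This is an illustrative toy comparison under assumed binary spike inputs, not code from SpikingBrain.

```python
def dense_output(weights, inputs):
    # Dense computation: every weight is multiplied, regardless of
    # whether the corresponding input carries any signal.
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

def event_driven_output(weights, spikes):
    # Event-driven computation: only columns whose input spiked (s == 1)
    # are accumulated; silent neurons are skipped entirely.
    out = [0.0] * len(weights)
    for j, s in enumerate(spikes):
        if s:
            for i, row in enumerate(weights):
                out[i] += row[j]
    return out

W = [[0.5, -0.2, 0.1],
     [0.3,  0.4, -0.6]]
spikes = [1, 0, 1]  # sparse binary activity: one silent neuron

print(dense_output(W, spikes))        # → [0.6, -0.3]
print(event_driven_output(W, spikes)) # same result, fewer multiplications
```

With realistic spiking activity (most neurons silent at any step), the event-driven path skips most of the work a dense layer would do, which is what lets spike-based models avoid the quadratic cost of dense attention on long inputs.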

The system also shows advantages in handling ultra-long sequences, making it suitable for areas such as legal and medical document analysis, high-energy physics, and DNA sequence modeling, according to the report.

Editor: Emmi Laine

Keywords:   LLM, AI, SpikingBrain, China, CAS, ChatGPT, BERT, OpenAI, Google, large language model, transformer, non-transformer, inference, model training, MetaX, neuromorphic chips, GPU, AI news