
China unveils brain-inspired AI that could redefine efficiency

GPU chips made by MetaX are displayed at the World AI Conference, Shanghai, China, July 29, 2025. /VCG

Chinese researchers have developed a new AI system, SpikingBrain-1.0, that breaks from the resource-hungry Transformer architecture used by models like ChatGPT. This new model, inspired by the human brain's neural mechanisms, charts a new course for energy-efficient computing.

Developed by a team at the Institute of Automation, Chinese Academy of Sciences, SpikingBrain-1.0 is a large-scale spiking neural network. Unlike mainstream AI that relies on ever-larger networks and data, this model allows intelligence to emerge from "spiking neurons," resulting in highly efficient training.
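The article does not specify which neuron model SpikingBrain-1.0 uses, but the classic illustration of spiking computation is the leaky integrate-and-fire (LIF) neuron: it accumulates input, leaks charge over time, and emits a discrete spike only when a threshold is crossed. A minimal sketch (purely illustrative, not the institute's implementation):

```python
# Illustrative leaky integrate-and-fire (LIF) neuron -- a generic sketch
# of spiking computation, not SpikingBrain-1.0's actual neuron model.

def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Return a binary spike train: membrane potential integrates input,
    leaks each step, and fires (then resets) when it crosses threshold."""
    v = 0.0
    spikes = []
    for x in inputs:
        v = leak * v + x          # leaky integration of input current
        if v >= threshold:
            spikes.append(1)      # fire a spike...
            v = 0.0               # ...and reset the potential
        else:
            spikes.append(0)      # silent step: no event, little energy
    return spikes

print(lif_neuron([0.5, 0.5, 0.5, 0.0, 0.9]))  # → [0, 0, 1, 0, 0]
```

Because neurons stay silent most of the time and communicate only through sparse spikes, computation is event-driven, which is the source of the energy savings the article describes.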

It achieves performance on par with many free-to-download models using only about 2 percent of the data required by competitors.

The model's efficiency is particularly evident when handling long data sequences. In one variant, SpikingBrain-1.0 showed a 26.5-fold speed-up over Transformer architectures when generating the first token from a one-million-token context. This makes it ideal for tasks in legal and medical document analysis, high-energy physics, and DNA sequencing.
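The long-context advantage follows from asymptotic scaling: standard self-attention compares every token pair, so prefilling an n-token context before the first token costs O(n²), while an architecture that carries a constant-size recurrent state costs O(n). A toy cost model makes the scaling gap concrete (the constants are invented for illustration; the article reports only the measured 26.5-fold speed-up):

```python
# Toy time-to-first-token (TTFT) cost model. Constants are arbitrary;
# only the scaling behaviour (quadratic vs. linear in context length)
# is the point.

def ttft_quadratic(n, c=1e-9):
    return c * n * n   # pairwise attention over the full context

def ttft_linear(n, c=1e-7):
    return c * n       # single pass with a constant-size state

# Doubling the context quadruples the quadratic cost but only
# doubles the linear one -- the gap widens with context length.
for n in (250_000, 500_000, 1_000_000):
    print(n, ttft_quadratic(n), ttft_linear(n))
```

At a one-million-token context, this widening gap is why even a modest per-token advantage compounds into a large first-token speed-up.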

A chart explaining the SpikingBrain model /CAS

SpikingBrain-1.0 was trained and run entirely on a domestic Chinese GPU platform, the MetaX C550, marking a significant step for China's self-sufficient AI ecosystem. The researchers have released the model for free download, along with a bilingual technical report.

According to Xu Bo, director of the Institute of Automation, this model "opens up a non-Transformer technical path for the new generation of AI development" and may inspire the design of low-power neuromorphic chips.

This follows a previous success from the same institute, where they collaborated with Swiss scientists to develop an energy-efficient neuromorphic chip called "Speck" with a remarkably low power consumption of just 0.42 milliwatts. This is a crucial step towards replicating the human brain's incredible efficiency, which operates on only about 20 watts of power.
