China's upgraded light-powered 'AGI chip' is now a million times more efficient than before, researchers say

Brain AI chip technology concept, 3D render. (Image credit: BlackJack3D/Getty Images)

Scientists in China have unveiled Taichi-II, an upgraded version of their fully optical artificial intelligence (AI) chip that they say could one day power artificial general intelligence (AGI) systems.

The first Taichi chip was unveiled by researchers in April 2024. Whereas conventional chips rely on tiny electronic switches that flip on or off when a voltage is applied, the small, modular device performs its calculations using photons, or particles of light.

Compared with its predecessor, Taichi-II is 40% more accurate in classification tasks, which involve sorting and identifying different types of information, and delivers a "six orders of magnitude" (i.e., a million-fold) improvement in energy efficiency in low-light conditions, South China Morning Post (SCMP) reported.

The researchers achieved this leap in performance by training the AI directly on the optical chip, rather than relying on digital simulations — a process the scientists called "fully forward mode." They described their findings in a study published Aug. 7 in the journal Nature.

Fully forward mode is an AI training method in which signals move through the system in only one direction: forward. This differs from conventional training, which relies on backpropagation, where error signals are sent backwards through a digital model of the network in repeated, iterative passes. As light passes through the chip, it interacts with tiny components that steer it and modulate its phase and intensity. Adjusting those components directly updates the AI model's parameters, allowing it to learn on the hardware itself, in real time, without repeated digital simulation.
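To give a rough feel for what training without backward passes looks like, here is a minimal Python sketch that updates a toy model using only forward evaluations, estimating gradients from random perturbations (a technique known as SPSA). This is only a loose software analogy, not the optical fully forward mode algorithm described in the Nature paper, and every name in it (toy_model, loss and so on) is invented for illustration.

```python
# Toy illustration of "forward-only" learning: no backpropagation,
# just pairs of forward evaluations used to estimate the gradient (SPSA).
# This is an analogy for the concept, not the researchers' optical method.
import numpy as np

rng = np.random.default_rng(0)

def toy_model(params, x):
    # Stand-in for the physical system: output depends only on a forward pass.
    return np.tanh(x @ params)

def loss(params, x, y):
    return np.mean((toy_model(params, x) - y) ** 2)

# Synthetic data and an initial guess for the parameters.
x = rng.normal(size=(32, 8))
true_params = rng.normal(size=(8,))
y = np.tanh(x @ true_params)
params = rng.normal(size=(8,))

lr, eps = 0.1, 1e-3
for step in range(200):
    # Pick a random perturbation direction, then run two forward passes
    # to estimate how the loss changes along that direction.
    delta = rng.choice([-1.0, 1.0], size=params.shape)
    g_est = (loss(params + eps * delta, x, y) - loss(params - eps * delta, x, y)) / (2 * eps) * delta
    params -= lr * g_est  # parameter update built from forward passes only

print("final loss:", loss(params, x, y))
```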

Fully forward mode makes photonic chips even faster to train, but these chips already hold significant advantages over conventional electronics. Light-based chips consume far less energy and can perform calculations much faster than traditional chips because they can process many signals simultaneously. Photons, unlike electrons, travel at the speed of light and encounter no electrical resistance, so they generate far less heat as they move through the chip, allowing faster and more efficient processing.

Related: 'Quantum-inspired' laser computing is more effective than both supercomputing and quantum computing, startup claims

The Taichi chip works in a similar way to other light-based chips, but the researchers previously said it can be scaled up far more readily than competing designs. That is because the chiplet combines two established ways of manipulating light within the chip to perform calculations, optical diffraction and optical interference, drawing on the strengths of each approach.

An AI chip that operates entirely photonically could eventually power AGI models: extremely powerful AI systems capable of human-like intelligence and reasoning, with the capacity to learn new skills beyond the confines of their training data.

While this hypothetical technology is still many years away from reality (at least, according to most predictions), the developers of Taichi suggested in their April 2024 paper that the chip's modular architecture meant multiple chiplets could be combined to build an extremely powerful AI system.

They demonstrated this in an experiment, stitching together several Taichi chiplets and comparing the combined system's performance with that of other light-based chips in key areas. The assembly was able to run a network of nearly 14 million artificial neurons, far larger than the 1.47 million neurons achieved by the next-best design.

It did this while being extremely energy-efficient, performing over 160 trillion operations for every watt of power it used. To put that into perspective, a photonic chip from 2022 could only manage 3 trillion operations per watt, and most conventional chips designed for similar tasks typically perform well under 10 trillion operations per watt.
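As a quick sanity check on those figures, the short Python sketch below works out the implied ratios. The operations-per-watt values are simply the numbers quoted above, and the arithmetic is mine rather than the researchers'.

```python
# Back-of-envelope comparison of the efficiency figures quoted in the article.
taichi_ops_per_watt = 160e12        # >160 trillion operations per watt (combined Taichi system)
photonic_2022_ops_per_watt = 3e12   # ~3 trillion ops/W for the 2022 photonic chip cited
conventional_ops_per_watt = 10e12   # upper bound quoted for comparable conventional chips

print(f"vs. 2022 photonic chip: ~{taichi_ops_per_watt / photonic_2022_ops_per_watt:.0f}x more efficient")
print(f"vs. conventional chips: at least {taichi_ops_per_watt / conventional_ops_per_watt:.0f}x more efficient")
```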

The researchers said Taichi-II is a key step towards moving light-based AI chips from theory to practical application and meeting the growing demand for powerful, energy-efficient computing, SCMP reported. Such hardware will be important for developing AGI models, although concerns remain over the implications of that technology.

Owen Hughes

Owen Hughes is a freelance writer and editor specializing in data and digital technologies. Previously a senior editor at ZDNET, Owen has been writing about tech for more than a decade, during which time he has covered everything from AI, cybersecurity and supercomputers to programming languages and public sector IT. Owen is particularly interested in the intersection of technology, life and work – in his previous roles at ZDNET and TechRepublic, he wrote extensively about business leadership, digital transformation and the evolving dynamics of remote work.