Key to faster 6G speeds lies in letting new AI architecture take control, scientists say
Scientists are developing AI models that analyze wireless traffic as a whole, making high-speed networks such as 6G faster and more reliable for users of cell phones and other mobile devices.

Scientists are developing artificial intelligence (AI) models that could help next-generation wireless networks such as 6G deliver faster and more reliable connections.
In a study published in the December 2024 edition of IEEE Transactions on Wireless Communications, researchers detailed an AI system that reduces the amount of information that needs to be sent between a device and a wireless base station — such as a cell tower — by focusing on key information such as angles, delays and signal strength.
By optimizing signal data in wireless networks that use high-frequency millimeter-wave (mmWave) bands of the electromagnetic spectrum, the researchers significantly reduced connectivity errors, and the AI system improved data reliability and connectivity in diverse environments, such as urban areas with moving traffic and pedestrians.
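The study's exact model isn't spelled out here, but the principle of parametric CSI feedback can be sketched in a few lines of Python. In the toy example below, the antenna count, path count and narrowband simplification (path delays are dropped) are illustrative assumptions rather than details from the paper: instead of reporting one complex number per antenna, the device reports a handful of path parameters that the base station uses to rebuild the channel.

```python
import numpy as np

# Toy sketch of parametric CSI feedback -- not the authors' code. The array
# size, path count and narrowband simplification (no delays) are assumptions.
N_ANT, N_PATHS = 64, 3

def steering_vector(angle_rad, n_ant=N_ANT):
    """Array response of a half-wavelength-spaced uniform linear array."""
    k = np.arange(n_ant)
    return np.exp(1j * np.pi * k * np.sin(angle_rad)) / np.sqrt(n_ant)

rng = np.random.default_rng(0)
angles = rng.uniform(-np.pi / 3, np.pi / 3, N_PATHS)        # path angles
gains = (rng.normal(size=N_PATHS) + 1j * rng.normal(size=N_PATHS)) / np.sqrt(2)

# Full CSI: one complex coefficient per antenna.
h = sum(g * steering_vector(a) for g, a in zip(gains, angles))

# Parametric feedback sends one angle plus one complex gain per path instead.
print("full CSI:", 2 * N_ANT, "real numbers")               # 128
print("parametric:", 3 * N_PATHS, "real numbers")           # 9

# The base station rebuilds the channel from the handful of parameters.
h_hat = sum(g * steering_vector(a) for g, a in zip(gains, angles))
print("reconstruction error:", np.linalg.norm(h - h_hat))   # ~0 in this noiseless toy
```

In this toy, 64 antennas would naively need 128 real numbers of feedback per snapshot, while three paths need only nine — the kind of saving that makes feedback cheap enough to refresh frequently as conditions change.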
"To address the rapidly growing data demand in next-generation wireless networks, it is essential to leverage the abundant frequency resource in the mmWave bands," said the lead author of the study, Byungju Lee, a professor in the telecommunications department at Incheon National University, South Korea.
Related: Future wearable devices could draw power through your body using background 6G cellphone signals
"Our method ensures precise beamforming, which allows signals to connect seamlessly with devices, even when users are in motion," said Lee.
Smarter ways to shape waves
The challenge for networks that use high-frequency radio spectrum like mmWave is that they rely on a large group of antennas working together through massive multiple-input multiple-output (MIMO). The process needs precise information, referred to as "channel state information" (CSI), to deliver connectivity between base stations and mobile devices with compatible antennas.
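To see why precise CSI matters, consider a rough beamforming calculation. The 64-antenna uniform linear array below is an illustrative assumption, not the study's setup: with that many antennas the beam is so narrow that an angle estimate just 3 degrees off throws away most of the gain.

```python
import numpy as np

N_ANT = 64  # assumed uniform linear array, half-wavelength spacing

def steering_vector(angle_rad, n_ant=N_ANT):
    k = np.arange(n_ant)
    return np.exp(1j * np.pi * k * np.sin(angle_rad)) / np.sqrt(n_ant)

user_angle = np.deg2rad(20.0)
h = steering_vector(user_angle)               # channel direction toward the user

w_accurate = steering_vector(user_angle)      # beam aimed with perfect CSI
w_stale = steering_vector(np.deg2rad(23.0))   # beam aimed with a 3-degree error

# With 64 antennas the beam is so narrow that a few degrees of CSI error
# costs most of the beamforming gain.
print("gain with accurate CSI:", round(abs(np.vdot(w_accurate, h)) ** 2, 3))  # 1.0
print("gain with stale CSI:   ", round(abs(np.vdot(w_stale, h)) ** 2, 3))     # ~0.04
```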
This situation is further complicated by changes in a network's environment, such as antennas moving with people and traffic, or obstructions in the line of sight between devices and cell towers. This leads to "channel aging": a mismatch between the predicted channel state and its actual state, which degrades performance through reduced data throughput and signal quality.
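A toy model makes the timescale concrete. Assuming, for illustration only, a 28 GHz carrier and a user at walking pace, the Doppler shift is around 140 Hz, and a channel snapshot drifts meaningfully within a few milliseconds:

```python
import numpy as np

# Toy sum-of-paths model of channel aging; the carrier, speed and path count
# are assumed for illustration, not taken from the study.
c, fc, v = 3e8, 28e9, 1.5                     # light speed, 28 GHz carrier, walking pace
f_doppler = v * fc / c                        # max Doppler shift: 140 Hz

rng = np.random.default_rng(0)
n_paths = 8
gains = (rng.normal(size=n_paths) + 1j * rng.normal(size=n_paths)) / np.sqrt(2 * n_paths)
dopplers = f_doppler * np.cos(rng.uniform(0, 2 * np.pi, n_paths))  # per-path shifts

def channel(t):
    """Channel coefficient at time t (seconds): each path's phase drifts."""
    return np.sum(gains * np.exp(2j * np.pi * dopplers * t))

h0 = channel(0.0)
for t_ms in (1.0, 5.0, 10.0):
    drift = abs(channel(t_ms / 1e3) - h0)
    print(f"after {t_ms:4.1f} ms the channel has drifted by {drift:.2f}")
```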
To try to overcome such challenges, the study's authors used a newer kind of AI model known as a transformer. Convolutional neural networks (CNNs) have typically been used to help predict and optimize wireless network traffic by recognizing and classifying signal patterns.
But the researchers took a different approach: by using a transformer model instead of a CNN, their network analysis method could track both short- and long-term patterns in signal changes. As a result, the AI system, dubbed "transformer-assisted parametric CSI feedback," could make real-time adjustments in the wireless network to improve the connection quality between a base station and a user, even if the latter was moving quickly.
The improvement is explained by the difference between CNNs and transformers. Both are neural network models that analyze patterns in data (in this case, patterns in radio signals), but CNNs tend to be trained on smaller datasets and focus on "local" features, whereas transformer models use larger datasets and have a self-attention mechanism that lets them weigh the importance of different input elements and their relationships at both a global and a local level.
In simple terms, a transformer model will learn about an image as a whole, while a CNN has a bias toward features like edges and textures. Transformers see the bigger picture, so to speak.
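Self-attention, the mechanism behind that bigger-picture view, fits in a few lines. The sketch below is a generic single-head implementation, not the study's architecture; the sequence could stand for CSI snapshots over time, and every output element is built from comparisons against every input element.

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Single-head scaled dot-product self-attention over a sequence x.

    x has shape (seq_len, d_model) -- here, think of a sequence of CSI
    snapshots over time. Every output row is a weighted mix of ALL rows.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])            # all-pairs comparisons
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the sequence
    return weights @ v

rng = np.random.default_rng(0)
seq_len, d = 10, 16                                    # 10 time steps, 16 features each
x = rng.normal(size=(seq_len, d))
wq, wk, wv = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))
print(self_attention(x, wq, wk, wv).shape)             # (10, 16)
```

A convolutional layer, by contrast, mixes only a handful of neighboring steps at a time, so relating distant snapshots takes many stacked layers; attention relates them in a single step, at the cost of comparing every pair.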
Transformer models are, however, more computationally demanding than CNNs. But if they can deliver robust next-generation wireless networks, they could be the key to high-speed wireless communication in the near future.
Roland Moore-Colyer is a freelance writer for Live Science and managing editor at consumer tech publication TechRadar, running the Mobile Computing vertical. At TechRadar, one of the largest consumer technology websites in the U.K. and the U.S., he focuses on smartphones and tablets. Beyond that, he taps into more than a decade of writing experience to bring people stories covering electric vehicles (EVs), the evolution and practical use of artificial intelligence (AI), mixed reality products and use cases, and the evolution of computing both on a macro level and from a consumer angle.