Google DeepMind's robotic arm can now beat humans at table tennis

A robotic arm plays table tennis against a human.
(Image credit: Google DeepMind)

Google DeepMind has trained a robotic arm to beat mere mortals at table tennis, a new study reports. But Fan Zhendong, the 2024 Olympic gold medalist in men's singles and team table tennis, can rest easy: The artificial intelligence (AI)-powered robot could beat only mediocre players, and only some of the time, according to the study, which was published Aug. 7 to the preprint database arXiv and has not been peer-reviewed.

Robots can now cook, clean and perform acrobatics, but they struggle to quickly respond to real-world environmental information.

"Achieving human-level performance in terms of accuracy, speed and generality still remains a grand challenge in many domains," the researchers wrote in the study.


To overcome this limitation, the researchers combined an industrial robot arm with a customized version of DeepMind's learning software. DeepMind's AI systems use neural networks, layered architectures that loosely mimic how the human brain processes information, to gradually learn from data. So far, the company's AI has beaten the world's best Go player, predicted the structure of nearly every protein in the human body, cracked decades-old mathematics problems and more.
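For readers unfamiliar with the term, here is a minimal, hypothetical sketch in Python of what "layered" means: each layer transforms its input and hands the result to the next, and the numbers in the layers (the weights) are what get adjusted during learning. The shapes and values below are invented for illustration; this is not DeepMind's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Simple nonlinearity applied between layers.
    return np.maximum(0.0, x)

# Two layers of weights mapping an 8-number input to a single output.
W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 1)), np.zeros(1)

def forward(x):
    hidden = relu(x @ W1 + b1)   # first layer extracts intermediate features
    return hidden @ W2 + b2      # second layer combines them into a prediction

# e.g., a small vector describing a ball's position and speed (made up here)
x = rng.normal(size=(1, 8))
print(forward(x))
```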

The system was trained to master specific aspects of the game, such as learning the rules, creating topspin, delivering forehand serves and targeting backhand shots, using a mix of simulated and real-world data. As the AI learned, the researchers also collected data on its strengths, weaknesses and limitations, then fed this information back to the program, giving the agent a realistic picture of its own abilities. During a match, the system picked which skill or strategy to use in the moment, taking into account its opponent's strengths and weaknesses, much as a human table tennis player would.
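A simplified, hypothetical sketch of that hierarchical idea follows: several low-level "skill" policies, plus a high-level controller that picks the skill whose estimated success rate is highest for the incoming ball, nudged toward the opponent's known weaknesses. All names and numbers are invented for illustration; this is not DeepMind's code.

```python
# Low-level skills the robot could choose between (illustrative names only).
SKILLS = ["forehand_topspin", "forehand_serve", "backhand_target"]

# Self-knowledge gathered during training: estimated success rate of each
# skill for each type of incoming ball (values are made up).
SKILL_STATS = {
    "forehand_topspin": {"low_ball": 0.8, "high_ball": 0.4},
    "forehand_serve":   {"low_ball": 0.7, "high_ball": 0.5},
    "backhand_target":  {"low_ball": 0.6, "high_ball": 0.3},
}

def choose_skill(ball_type, opponent_weakness=None):
    """High-level controller: pick the skill with the best expected payoff."""
    def score(skill):
        base = SKILL_STATS[skill][ball_type]
        # Bias toward skills that exploit a known opponent weakness.
        bonus = 0.1 if opponent_weakness and opponent_weakness in skill else 0.0
        return base + bonus
    return max(SKILLS, key=score)

# Example: a low incoming ball against an opponent who is weak on the backhand.
print(choose_skill("low_ball", opponent_weakness="backhand"))
```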

Video: "Achieving human level competitive robot table tennis" highlights (YouTube)

Then, they pitted their AI-controlled robot against 29 humans. DeepMind's robot arm beat all of the beginners and about 55% of the intermediate players, but it got trounced by advanced players. In an international rating system, it would be a solid amateur player.

DeepMind's robot arm did have some systematic weaknesses, however. For example, it struggled with high balls and, like many of us, found backhand shots more challenging than forehand ones.

Most of the human players seemed to enjoy playing against the system. "Across all skill groups and win rates, players agreed that playing with the robot was 'fun' and 'engaging,'" the researchers wrote in the study.

The new approach could be useful for a wide range of applications that call for quick responses in dynamic physical environments, the researchers said.

Tia Ghose
Managing Editor

Tia is the managing editor and was previously a senior writer for Live Science. Her work has appeared in Scientific American, Wired.com and other outlets. She holds a master's degree in bioengineering from the University of Washington, a graduate certificate in science writing from UC Santa Cruz and a bachelor's degree in mechanical engineering from the University of Texas at Austin. Tia was part of a team at the Milwaukee Journal Sentinel that published the Empty Cradles series on preterm births, which won multiple awards, including the 2012 Casey Medal for Meritorious Journalism.