Robot Madness: Creating True Artificial Intelligence
In Robot Madness, LiveScience examines humanoid robots and cybernetic enhancement of humans, as well as the exciting and sometimes frightening convergence of it all. Return for a new episode each Monday, Wednesday and Friday through April 6.
Artificial intelligence in the form of Deep Blue may have beaten human chess champions, but don't expect robots to fetch you a beer from the fridge just yet.
Robotic artificial intelligence (AI) excels mainly at formal logic, which lets it sift through thousands of Web sites to match a Google search or pick out the right chess move from hundreds of previous games. It's a different story when AI has to connect that abstract logic to real-world meanings, such as the ones attached to "beer" or "fridge handle."
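To make that contrast concrete, here is a minimal sketch of the kind of symbol matching that search builds on. Everything in it is invented for illustration; nothing comes from Google's or Deep Blue's actual code. The program "finds" beer without any notion of what beer is:

```python
# Hypothetical sketch of keyword matching: pure symbol manipulation.
# The documents and names are made up for illustration only.
docs = {
    "recipe": "chill the beer before serving",
    "manual": "pull the fridge handle to open the door",
}

query = "beer"
hits = [name for name, text in docs.items() if query in text]
print(hits)  # ['recipe'] -- a match on strings, with no grounding in the world
```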
"People realized at some point that you can only get so far with a logical approach," said Matt Berlin, an AI researcher with MIT's Media Lab. "At some point these symbols have to be connected to the world."
A robot fetching a beer has to realize that it should go to the fridge, figure out where the handle is and how to open the fridge door, and distinguish between beer cans and soda cans. It should know not to crush the beer can in its grasp. Finally, it should know that handing a beer over isn't the same as dropping the can in someone's lap, Berlin noted.
Even the most painstaking lines of logic can't convey actual understanding of what each step means in the real world, unless robots can perceive that world and learn from their experiences.
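As a hypothetical illustration (the step names below are invented, not drawn from Berlin's research or any real robot API), the symbolic plan for fetching a beer is trivial to write down; every step hides a grounding problem that the logic alone cannot solve:

```python
# Hypothetical sketch: stating the plan is the easy part. Each entry in
# GROUNDING_PROBLEMS stands in for a hard, open perception or
# manipulation problem; none of these names come from a real robot API.

PLAN = [
    "go_to(fridge)",
    "grasp(handle)",
    "open(door)",
    "identify(beer_can)",
    "pick_up(beer_can)",
    "hand_over(beer_can)",
]

GROUNDING_PROBLEMS = {
    "go_to(fridge)": "recognize a fridge from raw camera pixels",
    "grasp(handle)": "locate the handle and plan a workable grip",
    "open(door)": "pull with enough force, but not too much",
    "identify(beer_can)": "tell beer cans apart from soda cans",
    "pick_up(beer_can)": "hold the can firmly without crushing it",
    "hand_over(beer_can)": "release into a hand, not onto a lap",
}

for step in PLAN:
    # The symbolic sequence is complete, yet the robot still knows
    # nothing about fridges, handles or beer until each lookup below
    # is replaced by real perception and control.
    print(f"{step:22s} needs: {GROUNDING_PROBLEMS[step]}")
```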
[Video: "Sensational Learning: Robot Minds Grow By Feel." Turns out a brain needs a body to make a mind; robots must learn how to conceptualize "feelings" by touching, hearing and seeing for themselves. We cannot teach them. Credit: Thomas Lucas, Producer / Rob Goldberg, Writer]
"People learn what a word means in a truly grounded way," Berlin told LiveScience. Researchers around the world are trying to replicate the human perception that permits such learning, which means building things such as robotic hands that can feel what they grasp.
One major challenge is getting robots to see the world as well as people.
"As humans, we can detect where there's shadows, colors and objects," said Chad Jenkins, a robotics expert at Brown University. "That has proven extremely difficult for robots."
Jenkins is working on a robot that can respond to nonverbal commands, such as gestures. His research group took a bomb-disposal PackBot that's normally controlled by a human soldier, and hard-coded it to understand gesture commands such as "follow," "halt," "wait" and "door breach."
The upgraded PackBot carries a camera that provides depth perception, so the robot can extract and follow a person's silhouette against almost any background. Eventually, Jenkins hopes that a soldier could "train" the PackBot by performing new gestures and telling the robot to remember them.
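A rough sense of why depth helps: a person standing a meter or two away occupies a distinct band of distances, so segmenting the silhouette can be as simple as thresholding the depth image. The sketch below (NumPy, with invented near/far limits and a synthetic frame; the actual PackBot pipeline is not described in detail here) shows the idea:

```python
import numpy as np

def extract_silhouette(depth, near=0.5, far=2.5):
    """Boolean mask of pixels whose depth (meters) falls in the person's range."""
    return (depth > near) & (depth < far)

def follow_target(depth):
    """Centroid (row, col) of the silhouette the robot would steer toward."""
    mask = extract_silhouette(depth)
    if not mask.any():
        return None  # nobody in range
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic test frame: background wall at 4 m, a "person" at 1.5 m.
frame = np.full((120, 160), 4.0)
frame[30:100, 60:100] = 1.5
print(follow_target(frame))  # centroid of the near region: (64.5, 79.5)
```

A color camera offers no such shortcut: a person in a green shirt standing in front of green foliage has no clean boundary, which is part of what makes ordinary vision "extremely difficult for robots."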
That hints at a future where a single person could easily supervise his or her own team of robots, each with a different form and different capabilities, Jenkins said.
But humans need not fret during the wait for their robotic "Jeeves." New technology promises upgrades for people as well.
Robot Madness Episode 3: Human Becomes 'Eyeborg'
- Video: Sensational Learning: Robot Minds Grow By Feel
- Robot Madness Episode 1: Preventing Insurrection of Machines
- More Robot News and Information