Technology has provided us with many facsimiles of human life. Computer simulations, artificial intelligence programs, and automated chatbots all give us the illusion of a communicating, thinking being. But what is intelligence? Is it the ability to take pre-programmed information and run through a series of responses in a predictable but human-seeming manner? Many believe that intelligence isn't simply a series of programmed responses, but rather a complex system of learning and adapting to new situations. And now one robot, a three-foot-tall open-source research project called iCub, is drawing on human psychology as it not only adapts, but actually learns like a human baby.
Outrageous claims about learning behavior have been made in the past by the creators of technological marvels. What is difficult to see from the outside is the considerable amount of programming that goes into guiding a robot through any given situation. These routines must almost always be designed in advance, while the machine is hooked up to a computer. Robots may seem perfectly capable of responding to a changing environment, but one need only spend a few days around them to understand how limited they are deep down.
A simple example of this came in 2006, when Honda's ASIMO robot, one of the great technological wonders of the first decade of the millennium, attempted the simple task of ascending a flight of stairs during a demonstration in Japan. Something in the robot's programming failed to adapt to the situation, and it took a spill, tumbling down several stairs in an embarrassing display for its designers. Fortunately, the damage to the robot was minimal, but the incident publicly demonstrated some of the limitations robots would have to overcome. The number of man-hours required to make a robot function in even the most basic capacity is daunting. If robots are ever to become what we imagine for them, we will have to find new ways of teaching them how to learn.
And learning is the key word in this latest technology. Videos of the device speaking and observing seem rudimentary, perhaps even like a step backward in technology. But what's going on behind the robot's eyes is the key to this next step. These machines don't just act out programmed scripts; they learn and adapt, which puts them in the early stages of a new revolution in artificial intelligence.
The monotone voice of the small, baby-shaped device with its arms outstretched can be more than a little unnerving, especially given that the iCub is actually responding to its environment in ways it has learned. So why give it a body at all? The answer comes from a theory known as embodied cognition. Researchers involved in the project think that artificial intelligence doesn't work as well as we hope when it's confined to a disembodied program whose only input comes from a keyboard. If a robot can instead move around and see the world for itself, it will be better able to form the rudimentary associations needed to understand simple concepts.
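To make the idea concrete, here is a minimal toy sketch in Python of what grounding a concept in sensory experience might look like. Everything in it is invented for illustration: the feature names, the numbers, and the nearest-prototype approach are assumptions, not the iCub project's actual code. The point is simply that the label "ball" gets tied to patterns across several sensory channels rather than to a dictionary definition.

```python
# Toy illustration of the embodied-cognition idea: a concept is grounded
# in multiple sensory channels rather than in symbols alone. All feature
# names and values here are hypothetical; this is not iCub's actual code.

from math import dist  # Euclidean distance (Python 3.8+)

# Hypothetical sensor readings: (redness, roundness, softness), gathered
# as the "robot" looks at and squeezes objects it encounters.
experiences = {
    "ball":  [(0.9, 0.95, 0.7), (0.8, 0.9, 0.75), (0.85, 0.92, 0.65)],
    "block": [(0.2, 0.1, 0.1), (0.3, 0.15, 0.05), (0.25, 0.05, 0.1)],
}

def prototype(samples):
    """Average the sensory features observed for one concept."""
    n = len(samples)
    return tuple(sum(feature) / n for feature in zip(*samples))

prototypes = {label: prototype(s) for label, s in experiences.items()}

def classify(observation):
    """Name a new object by its nearest learned sensory prototype."""
    return min(prototypes, key=lambda label: dist(prototypes[label], observation))

# A never-before-seen object that is fairly red, round, and squishy
# lands closest to the "ball" prototype.
print(classify((0.82, 0.88, 0.6)))  # -> ball
```

The design choice this toy mirrors is the one the researchers describe: the concept is defined by regularities in what the body senses, so a new object can be named even though it matches no stored script exactly.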