Robots could learn the same way babies do

Babies learn by touching things, playing around, and watching. But robots learn only when developers write code for them, or when someone manually guides them through a desired movement. So computer scientists paired up with developmental psychologists to make teaching baby bots a little more like teaching baby humans.
 
The team built computer algorithms based on studies of infant learning and then put them to the test with robots. The idea was tested in two ways: a gaze-following experiment run in computer simulation, and an experiment in which a physical robot imitates a human.
 
In the gaze scenario, a simulated robot first learns the mechanics of its own head movements, then watches a human turn their head. Using that knowledge, the robot turns its own head so it is looking in the same direction as the human. In another test, the robot is taught that blindfolds make it impossible to see. With that newfound knowledge, the robot stops turning toward where a blindfolded human appears to be "gazing."
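
The sketch below is only an illustration of that idea, not the researchers' actual algorithm: head pose is reduced to a single yaw angle, the "learned head mechanics" are stood in for by an assumed maximum turn speed, and the blindfold rule is hard-coded as a learned fact.

```python
import math

# Toy gaze-following rule (illustrative sketch, not the study's method).
MAX_TURN_PER_STEP = 0.2  # assumed limit on how far the robot's head can turn per step


def follow_gaze(robot_yaw: float, human_yaw: float, human_blindfolded: bool) -> float:
    """Return the robot's next head yaw after observing the human."""
    if human_blindfolded:
        # Learned rule: a blindfolded human isn't looking at anything,
        # so there is nothing to gain by turning toward their "gaze."
        return robot_yaw

    # Turn toward the human's gaze direction, limited by the robot's own mechanics.
    error = human_yaw - robot_yaw
    step = max(-MAX_TURN_PER_STEP, min(MAX_TURN_PER_STEP, error))
    return robot_yaw + step


if __name__ == "__main__":
    yaw = 0.0
    for _ in range(10):
        yaw = follow_gaze(yaw, human_yaw=math.pi / 4, human_blindfolded=False)
    print(f"robot yaw after following: {yaw:.2f} rad (target {math.pi / 4:.2f})")
    print(f"blindfolded case stays at: {follow_gaze(0.0, math.pi / 4, True):.2f} rad")
```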
 
In the imitation experiment, the robot watches a human pick something up from a table, infers the goal, and then either mimics the human's movements exactly or finds an easier way to pick up the object. Both experiments are basic, but the team plans to teach robots more complicated tasks as well.
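
As a rough sketch of what "infer the goal, then pick the easier way" could look like (under assumptions of my own, not the team's method), the goal is taken to be the final position in the human's demonstration, and the robot compares copying the demonstrated path against a straight-line plan using a simple path-length cost.

```python
# Toy goal-based imitation (illustrative sketch, not the study's method).
from math import dist
from typing import List, Tuple

Point = Tuple[float, float]


def path_length(path: List[Point]) -> float:
    """Total length of a piecewise-linear path."""
    return sum(dist(a, b) for a, b in zip(path, path[1:]))


def choose_plan(demonstration: List[Point], robot_start: Point) -> List[Point]:
    """Infer the goal from the demo, then pick the cheaper way to reach it."""
    goal = demonstration[-1]                # assumed goal: where the demo ends up
    mimic = [robot_start] + demonstration   # copy the human's motion exactly
    direct = [robot_start, goal]            # easier plan: go straight to the goal
    return mimic if path_length(mimic) < path_length(direct) else direct


if __name__ == "__main__":
    demo = [(0.0, 0.0), (0.3, 0.5), (0.9, 0.4), (1.0, 0.0)]  # roundabout human reach
    print("chosen plan:", choose_plan(demo, robot_start=(0.0, 0.0)))
```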
 
“Babies learn through their own play and by watching others,” says Andrew Meltzoff, a psychology professor and collaborator on the research, in a press release. “They are the best learners on the planet, why not design robots that learn as effortlessly as a child?”