
Another sticky problem in robotics and computing is developing robots that can talk naturally. Natural language has proven so complex that we can’t program machines to handle it, largely because we don’t entirely understand how it works ourselves. One thing I’ve suggested in casual conversations is that perhaps the best way to get machines to talk would be to have them develop and learn their own language, and then teach them to translate it into ours.
As it goes, whenever you come up with an idea, there’s most likely someone somewhere in the world already working on it. Such is the case at Sony’s Computer Science Lab in Paris. Their robots perform various actions with their bodies in front of a mirror and give new actions a name. They then interact with the other robots, 20 in all, to discover whether they have given the same actions the same names. The robots adjust their vocabulary until they reach agreement on specific terms. They have proven remarkably adept at this and have even developed relatively complex concepts such as “left” and “right”. They also develop their language so rapidly that the researchers have had trouble keeping up, often needing up to a week to decipher the robots’ language. Now we just wait and see if the robots can figure out how to translate their language into ours…
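To get a feel for how that adjust-until-agreement process can converge, here is a toy “naming game” simulation in Python. It is purely my own minimal sketch of the idea, not the Sony lab’s actual system; the action names, the number of agents, and the helper functions are all made up for illustration.

```python
import random
import string

# Toy naming game: each agent keeps a set of candidate words per action,
# and repeated pairwise interactions push the group toward one shared word
# per action. (Illustrative sketch only, not the Sony CSL implementation.)

ACTIONS = ["raise_arm", "turn_left", "turn_right", "nod"]
NUM_AGENTS = 20  # matching the 20 robots mentioned above


def invent_word():
    """Make up a random word for a newly named action."""
    return "".join(random.choices(string.ascii_lowercase, k=5))


def play_round(agents):
    """One interaction: a speaker names an action, a hearer tries to agree."""
    speaker, hearer = random.sample(agents, 2)
    action = random.choice(ACTIONS)

    # The speaker invents a word if it has none for this action.
    if not speaker[action]:
        speaker[action].add(invent_word())
    word = random.choice(sorted(speaker[action]))

    if word in hearer[action]:
        # Agreement: both agents drop competing words for this action.
        speaker[action] = {word}
        hearer[action] = {word}
    else:
        # Disagreement: the hearer adopts the speaker's word as a candidate.
        hearer[action].add(word)


def converged(agents):
    """True once every agent uses a single, shared word for each action."""
    return all(
        all(len(a[action]) == 1 for a in agents)
        and len(set().union(*(a[action] for a in agents))) == 1
        for action in ACTIONS
    )


agents = [{action: set() for action in ACTIONS} for _ in range(NUM_AGENTS)]
rounds = 0
while not converged(agents):
    play_round(agents)
    rounds += 1

print(f"Shared vocabulary after {rounds} rounds:")
print({action: next(iter(agents[0][action])) for action in ACTIONS})
```

Even this stripped-down version settles on a common vocabulary after a few hundred interactions, which hints at why the real robots could outpace the researchers trying to decode them.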
Bonus ill-structured problem: Pedagogy – robots – language?