A new android infant has been born thanks to the University of California, San Diego's Machine Perception Lab. The lab received funding from the National Science Foundation to contract Kokoro Co. Ltd. and Hanson Robotics, two companies that specialize in building lifelike animatronics and androids, to build a replicant based on a one-year-old baby. The resulting robot, which has been in development for a couple of years, has finally been completed, and you can watch it smile and make cute faces.
With high-definition cameras in its eyes, Diego-san sees people, gestures, and expressions, and uses AI modeled on human babies to learn from people the way that an infant hypothetically would. The facial expressions are important for establishing a relationship and communicating intuitively with people. As much a work of art as of technology and science, Diego-san represents a step forward in the development of emotionally relevant robotics, building on David Hanson's previous work with the Machine Perception Lab, such as the emotionally responsive Einstein robot shown at TED in 2009 (another video here).
In 1970, the robotics professor Masahiro Mori coined the term "uncanny valley" for a hypothesis in the fields of robotics and 3D computer animation which holds that when human replicas look and act almost, but not perfectly, like actual human beings, they cause a response of revulsion among human observers. The "valley" refers to the dip in a graph of humans' comfort level as a function of a robot's human likeness. The hypothesis has been linked to Ernst Jentsch's concept of "the uncanny," identified in a 1906 essay, "On the Psychology of the Uncanny." Jentsch's conception was elaborated by Sigmund Freud in a 1919 essay entitled "The Uncanny" ("Das Unheimliche").
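To make that "dip in a graph" concrete, here is a minimal Python sketch of the curve Mori described. The functional form and the numbers are invented purely for illustration; Mori's hypothesis is qualitative, and no quantitative curve appears in his paper.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical comfort curve: affinity rises with human likeness, then dips
# sharply near (but not at) full likeness -- the "uncanny valley".
# The Gaussian dip below is an invented illustration, not measured data.
likeness = np.linspace(0, 1, 500)  # 0 = plainly mechanical, 1 = healthy human
comfort = likeness - 1.8 * np.exp(-((likeness - 0.85) ** 2) / 0.005)

plt.plot(likeness, comfort)
plt.axvspan(0.8, 0.9, alpha=0.1, label="uncanny valley (approx.)")
plt.xlabel("Human likeness")
plt.ylabel("Comfort level (affinity)")
plt.title("Sketch of Mori's uncanny valley hypothesis")
plt.legend()
plt.show()
```

The point of the sketch is the shape, not the values: comfort climbs as a robot looks more human, collapses when it looks almost but not quite human, and recovers only at full human likeness.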
What I would say is that basic research is done to be used in myriad ways, so that it can serve humans best.
And certainly this very advanced research in robotic expression can bring us closer to something as cute as Gumdrop, the 27-year-old Bulgarian robot actress.