Roboticists put a lot of emphasis on building robots that connect with us emotionally, with the goal of facilitating more symbiotic relationships. But usually that effort goes into creating robots that can communicate their “emotions.” In the biological world, that’s only one piece of the puzzle. For humans to truly engage with robots on an emotional level, robots need to be able to respond to our emotions as well. That’s why Vernon Stanley Albayeros Duarte, a student at Universitat Autònoma de Barcelona, built a face-following robot that reacts to human emotion.
Albayeros Duarte was inspired to build this robot by the Luxo Jr. lamp that appears in Pixar’s logo and short films. Originally, he wanted the robot to be able to hop around like Luxo Jr. does, but that proved too difficult an engineering challenge for this project. What he did accomplish, however, was making the robot feel like a pet. The robot is built around the Arduino-based LittleArm 2C robot arm, and was given a camera “eye” and an Adafruit NeoPixel LED ring. It looks around the room, and when it sees someone, it starts staring at them. Then, depending on what emotions that person is displaying, it reacts in a handful of different ways.
The computer vision is handled in two parts: face-following and emotion recognition. The face-tracking is done with an OpenCV script running on a Raspberry Pi, while emotion classification is offloaded to Google’s Cloud Vision API. Once the robot determines what emotion the person is displaying, it reacts in one of four ways: for joy it bops around, for anger it shakes and moves quickly, for sorrow it droops down and looks up at you, and for surprise it backs away as far as it can. The LED ring also changes color to mirror those emotions. The result is a robot that acts more like your dog than like a conventional robot, which may be exactly what we need if we’re going to interact with robots on a regular basis.
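To make the two-part pipeline concrete, here is a minimal Python sketch of the logic described above: a proportional correction that steers the arm toward a detected face, and a mapper from Cloud Vision face-annotation likelihoods to one of the four reactions. The likelihood field names (`joyLikelihood`, etc.) are real Cloud Vision API fields, but the function names, gain value, colors, and threshold are illustrative assumptions, not taken from Albayeros Duarte’s project.

```python
def pan_tilt_correction(face_center, frame_size, gain=0.1):
    """Proportional steering: how much to nudge the arm's pan/tilt
    so the detected face moves toward the center of the frame."""
    fx, fy = face_center
    w, h = frame_size
    # Offset of the face from frame center, normalized to [-1, 1].
    error_x = (fx - w / 2) / (w / 2)
    error_y = (fy - h / 2) / (h / 2)
    # Move opposite to the error to re-center the face.
    return (-gain * error_x, -gain * error_y)


# Cloud Vision face annotations report emotions as likelihood strings.
LIKELIHOOD_SCORE = {
    "VERY_UNLIKELY": 0, "UNLIKELY": 1, "POSSIBLE": 2,
    "LIKELY": 3, "VERY_LIKELY": 4,
}

# The four reactions from the article, paired with hypothetical LED colors.
REACTIONS = {
    "joy": ("bop around", (255, 220, 0)),          # warm yellow
    "anger": ("shake and move quickly", (255, 0, 0)),  # red
    "sorrow": ("droop and look up", (0, 0, 255)),  # blue
    "surprise": ("back away", (255, 255, 255)),    # white
}


def pick_reaction(annotation, threshold=3):
    """Return (motion, LED color) for the strongest emotion scoring at
    least LIKELY, or None if no emotion is confident enough."""
    scores = {
        emotion: LIKELIHOOD_SCORE[annotation[f"{emotion}Likelihood"]]
        for emotion in REACTIONS
    }
    strongest = max(scores, key=scores.get)
    if scores[strongest] < threshold:
        return None
    return REACTIONS[strongest]
```

In use, a face annotation with `"joyLikelihood": "VERY_LIKELY"` would select the “bop around” motion and the yellow LED color, while an annotation with no emotion above `POSSIBLE` would yield no reaction at all.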