Facial expressions are a powerful means of communicating emotions and intentions to others. In human-robot interaction, facial expressions can be used to indicate one's current status in a conversation, establish rapport, and convey sociability. For social robots to be more anthropomorphic, and for human-robot interaction to more closely resemble human-human interaction, robots need to be able to recognize human emotions and respond to them appropriately.
The robot's ability to detect and respond to smiles is based on the geometry of facial landmarks. By measuring the distance between the corners of the mouth, the robot can determine whether a smile is present: smiling widens the mouth, so an unusually large corner-to-corner distance indicates a smile. This information can then be used to trigger a corresponding facial expression in the robot, such as a smile or a frown.
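The geometric check described above can be sketched as a small classifier over landmark coordinates. This is a minimal illustration, not the authors' implementation: the landmark names, the normalization by jaw width (to make the measure scale-invariant), and the threshold value are all assumptions; in practice the points would come from a landmark detector such as dlib or MediaPipe.

```python
import math


def euclidean(p, q):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])


def is_smiling(left_mouth, right_mouth, left_jaw, right_jaw, threshold=0.5):
    """Classify a smile from landmark geometry.

    A smile stretches the mouth sideways, so the ratio of mouth width to
    jaw width rises. Dividing by jaw width makes the measure independent
    of how close the face is to the camera. The 0.5 threshold is an
    illustrative value that would need tuning on real data.
    """
    mouth_width = euclidean(left_mouth, right_mouth)
    jaw_width = euclidean(left_jaw, right_jaw)
    if jaw_width == 0:
        return False  # degenerate landmarks; treat as no smile
    return mouth_width / jaw_width > threshold


# Hypothetical landmark coordinates for a wide (smiling) and a narrow
# (neutral) mouth on the same jaw line.
print(is_smiling((25, 60), (75, 60), (10, 50), (90, 50)))  # wide mouth -> True
print(is_smiling((35, 60), (65, 60), (10, 50), (90, 50)))  # narrow mouth -> False
```

The boolean result could then drive the robot's own expression, e.g. mirroring a detected smile.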