Robots could be taught to recognise human emotions from our movements, a new study shows. Researchers found that people could recognise excitement, sadness, aggression and boredom from the way others moved, even without seeing their facial expressions or hearing their voices. The findings suggest robots could learn to use the same movements, alongside facial expressions and tone of voice, to recognise human internal states.

It raises the prospect that robots already used to teach second languages could recognise when students are bored, and that customer service robots could identify when people feel angry or stressed.

Dr Charlotte Edmunds, from Warwick Business School, said: “One of the main goals in the field of human-robot interaction is to create machines that can recognise human emotions and respond accordingly.

“Our results suggest it is reasonable to expect a machine learning algorithm, and consequently a robot, to recognise a range of emotions and social interactions using movement