Detecting emotions from facial expressions: not even AI can do this reliably, study concludes

Artificial intelligence (AI) systems are being developed today to infer people’s intentions and reactions from their facial expressions, but a new study suggests such inferences are not very reliable. The study analyzed photographs of actors to examine the relationship between facial expressions and human emotions. It found that people can use similar expressions to portray different emotions, while the same emotion can be expressed in many different ways. The study also found that much of the interpretation depended on context. Judging people’s inner states simply by running an algorithm over their facial expressions can therefore be a flawed method.

The researchers analyzed the facial expressions in 604 photographs of professional actors, sorting them into 13 emotion categories. The actors had been given emotion-evoking scenarios to react to, but the descriptions in no way suggested how they should feel about those scenarios.

The study was published in Nature Communications. The 13 categories were derived from the judgments of 839 volunteers and from the Facial Action Coding System, which maps specific action units to movements of particular facial muscles. Machine learning (ML) analyses revealed that actors portrayed the same emotion categories by contorting their faces in different ways, while similar expressions did not always convey the same emotions.
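The mismatch the researchers describe can be illustrated with a toy sketch. Below, each face is coded as a vector of action-unit intensities, loosely in the spirit of the Facial Action Coding System; the vectors and labels are invented for illustration and are not data from the study. Comparing vectors with cosine similarity shows how two faces with the same emotion label can have very different expression patterns, while two faces with different labels can look nearly identical:

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two action-unit intensity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy (label, action-unit vector) pairs -- purely illustrative values.
faces = [
    ("anger",   [0.9, 0.0, 0.1, 0.8]),
    ("anger",   [0.1, 0.0, 0.9, 0.2]),  # same label, different pattern
    ("sadness", [0.9, 0.0, 0.1, 0.8]),  # different label, identical pattern
]

same_label = cosine(faces[0][1], faces[1][1])
diff_label = cosine(faces[0][1], faces[2][1])
print(f"same emotion label, similarity {same_label:.2f}")       # low
print(f"different emotion labels, similarity {diff_label:.2f}") # high
```

In this contrived example the two "anger" faces are less alike than an "anger" face and a "sadness" face, which is the kind of many-to-many mapping that makes expression-only emotion inference unreliable.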

The study was carried out with two groups. In one, 842 people each rated about 30 faces across the 13 emotion categories. In the other, 845 people each evaluated about 30 faces paired with the scenarios that prompted them. The two groups’ judgments differed in most cases, leading to the conclusion that analyzing facial expressions out of context can produce misleading results. Context, therefore, was important in inferring a person’s emotional state.

“Our research goes against the traditional ‘emotional AI’ approach,” said Lisa Feldman Barrett, a professor of psychology at Northeastern University College of Science and one of seven researchers behind the study.

The researchers also wrote that these findings “join other recent summaries of empirical evidence to suggest that frowns, smiles and other facial configurations belong to a larger and more variable repertoire of the meaningful ways in which people move their faces to express emotions.”

A few months ago, a researcher called for regulation of AI tools being deployed in schools and workplaces to interpret human emotions. Kate Crawford, academic researcher and author of The Atlas of AI, said that “unverified systems” were being “used to interpret internal states”, and added that this technology needs to be regulated for better policy making and public confidence.

