Artificial emotional intelligence: AI starting to recognise emotions

BERLIN: Picking up on tone of voice, body language, what’s left unsaid – these are all part of being what’s called a good listener.

Not everybody has the wit, subtlety or emotional intelligence to pick up on cues that are often non-verbal: we all have a friend or colleague for whom the penny doesn’t drop, who is often slow to read the room.

Such tone-deafness could be about to get shown up even more, going by research carried out at the Max Planck Institute, which suggests that artificial intelligence systems could be within range of the best of us when it comes to listening for and sussing out feelings.

“[M]achine learning can be used to recognise emotions from audio clips as short as 1.5 seconds,” said Hannes Diemerling of the Max Planck Institute for Human Development, adding that the models showed “an accuracy similar to humans” when it came to hints of joy, anger, sadness, fear, disgust and neutrality in speech.

Diemerling and colleagues came to their conclusions after devising tests they believe show whether machine-learning models can “accurately recognise emotions regardless of language, cultural nuances, and semantic content.”

Drawing on Canadian and German datasets, they took “nonsense” sentences and cut them into clips of a second and a half, which they say is “how long humans need to recognise emotion in speech” and the briefest viable time-span to control for “overlapping of emotions.”
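The paper’s actual pipeline is not reproduced here, but the basic setup it describes – slicing speech into 1.5-second windows and sorting each window into one of six emotion categories – can be sketched roughly. Everything below (the MFCC features, the stand-in classifier, the file names and placeholder labels) is an illustrative assumption, not the authors’ method.

```python
# Illustrative sketch only: the study's published pipeline is not shown here.
# Assumed tooling: librosa for audio features, scikit-learn for a stand-in classifier.
import numpy as np
import librosa
from sklearn.neighbors import KNeighborsClassifier

EMOTIONS = ["joy", "anger", "sadness", "fear", "disgust", "neutral"]
CLIP_SECONDS = 1.5   # the window length the researchers cite as the human threshold
SAMPLE_RATE = 16000

def clip_features(path: str) -> np.ndarray:
    """Cut a recording into 1.5 s windows and return one feature vector per window."""
    audio, _ = librosa.load(path, sr=SAMPLE_RATE)
    hop = int(CLIP_SECONDS * SAMPLE_RATE)
    windows = [audio[i:i + hop] for i in range(0, len(audio) - hop + 1, hop)]
    # Mean MFCCs are a common, crude summary of the spectral shape of speech.
    return np.array([
        librosa.feature.mfcc(y=w, sr=SAMPLE_RATE, n_mfcc=13).mean(axis=1)
        for w in windows
    ])

# Hypothetical usage: "train.wav" and random labels stand in for the labelled
# Canadian/German datasets used in the study.
X_train = clip_features("train.wav")
y_train = np.random.choice(EMOTIONS, size=len(X_train))  # placeholder labels
model = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print(model.predict(clip_features("test.wav")))  # one emotion label per 1.5 s clip
```

Fixed-length windows make the task uniform for both humans and machines: every judgement, whether by a listener or a model, is based on exactly a second and a half of audio.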

While it did not address the real-life nuance of overlapping emotions, the team conceded that their investigation, which was published in the journal Frontiers in Psychology, had “some limitations.”

“Actor-spoken sample sentences” possibly do not get across “the full spectrum of real, spontaneous emotion,” they found, listing challenges posed by using different languages and datasets. – dpa
