In recent years, many computer scientists have developed deep-neural-network-based models that predict people's emotions from their facial expressions. Most of the models developed so far, however, only detect basic emotional states such as anger, happiness and sadness, rather than more subtle aspects of human emotion.
Psychology research, by contrast, has long characterized emotion along continuous dimensions, introducing measures such as valence (i.e., how positive or negative an emotional display is) and arousal (i.e., how calm or excited someone is while expressing an emotion). While estimating valence and arousal simply by looking at someone's face comes naturally to most humans, it can be challenging for machines.
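As an illustration of what these two dimensions encode, the snippet below places a few familiar categorical emotions on the valence–arousal plane. The coordinates are rough, hypothetical placements in the spirit of Russell's circumplex model of affect, not values taken from the paper.

```python
# Illustrative (approximate) valence-arousal coordinates, each in [-1, 1].
# These placements are hypothetical and only meant to show the representation.
circumplex = {
    "anger":     (-0.7,  0.8),  # negative valence, highly activated
    "happiness": ( 0.8,  0.5),  # positive valence, moderately activated
    "sadness":   (-0.7, -0.4),  # negative valence, deactivated
    "calm":      ( 0.6, -0.6),  # positive valence, deactivated
}

for emotion, (valence, arousal) in circumplex.items():
    print(f"{emotion:>9}: valence={valence:+.1f}, arousal={arousal:+.1f}")
```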
Researchers at Samsung AI and Imperial College London have recently developed a deep-neural-network-based system that can estimate emotional valence and arousal with high accuracy simply by analyzing images of human faces taken in everyday settings. The model, presented in a paper published in Nature Machine Intelligence, runs fast enough to detect these subtle qualities of emotion in real time (e.g., from frames captured by CCTV cameras).
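To make the task concrete, here is a minimal, hypothetical PyTorch sketch of a valence–arousal regressor: a generic CNN backbone (ResNet-18, chosen purely for illustration) with a two-unit head whose outputs are squashed into the conventional [-1, 1] range. This is not the authors' architecture, which is described in the Nature Machine Intelligence paper; it only shows the input–output structure of the problem. Such a network would typically be trained with a regression loss (e.g., mean squared error) against human-annotated valence and arousal labels.

```python
import torch
import torch.nn as nn
from torchvision import models

class ValenceArousalNet(nn.Module):
    """Hypothetical regressor mapping a face crop to (valence, arousal) in [-1, 1]."""

    def __init__(self):
        super().__init__()
        # Any standard CNN backbone works for this sketch; the paper's
        # actual architecture is more specialized.
        self.backbone = models.resnet18(weights=None)
        # Replace the 1000-way classification head with a 2-unit regression head.
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 2)

    def forward(self, x):
        # tanh constrains both outputs to the conventional [-1, 1] affect range.
        return torch.tanh(self.backbone(x))

model = ValenceArousalNet().eval()
with torch.no_grad():
    face = torch.rand(1, 3, 224, 224)  # stand-in for a preprocessed face crop
    valence, arousal = model(face)[0].tolist()
print(f"valence={valence:+.2f}, arousal={arousal:+.2f}")
```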