
What is Emotional AI?

Emotional AI refers to machines that use affective computing and AI techniques to sense, ‘feel-into’, learn about and interact with human emotional life.

Indeed, insofar as AI systems interact with people, it is reasonable to say that AI agents have no value until they are sensitive to feelings, emotions and intentions [1].

How?

If we begin by understanding that AI is about reactivity, we see that machines first have to sense emotions if they are to react appropriately. Sensing techniques that allow intelligent agents to discern people’s emotions include:

Online sentiment analysis: this analyses online language, emojis, images and video for evidence of moods, feelings and emotions (see the sketch after this list).

Facial coding of expressions: this analyses faces from a camera feed, a recorded video file, a video frame stream or a photo.

Voice analytics: the emphasis is less on analysing natural spoken language (what people say) than on how they say it, including elements such as the rate of speech, increases and decreases in pauses, and tone.

Eye-tracking: this measures point of gaze, eye position and eye movement.

Wearable devices: these sense galvanic skin response (GSR), which indicates emotional stimulation; muscle activity and tension (electromyography); heart rate (blood volume pulse); skin temperature; heartbeats and rhythms (electrocardiogram); respiration rate; and, via skull caps and other headwear, brain activity (electroencephalography).

Gesture and behaviour: cameras are used to track hands, face and other parts of the body said to communicate particular messages and emotional reactions.

Virtual Reality (VR): here people cede themselves to a synthetic environment, which allows remote viewers to understand and feel-into what the wearer is experiencing. Headwear may also contain EEG sensors and sensors that register facial movement.

Augmented Reality (AR): this is where reality is overlaid with additional computer-generated input (such as graphics). Remote viewers can track attention, reactions and interaction with digital objects.
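To make the first of these techniques concrete, below is a minimal sketch of text-based sentiment analysis in Python, using NLTK’s off-the-shelf VADER analyser. The sample posts are invented for illustration, and a real pipeline would also attend to emojis, images and video, as noted above.

import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download

analyser = SentimentIntensityAnalyzer()

# Invented sample posts; a real system would ingest live feeds.
posts = [
    "Absolutely loving this new phone!",
    "Worst customer service I have ever experienced.",
]

for post in posts:
    # polarity_scores returns neg/neu/pos proportions plus a
    # 'compound' score in [-1, 1] summarising overall sentiment.
    scores = analyser.polarity_scores(post)
    print(f"{post!r} -> compound sentiment {scores['compound']:+.2f}")

Note that this captures only sentiment polarity rather than discrete emotions; richer emotion taxonomies would need a trained classifier of the kind described in the next section.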

 

Machines that learn

The rise of technical interest in emotional life is indivisible from the growth in applications of AI and machine learning. While it is true that these technologies are undergoing a hype cycle, this should not detract from the fact that they are here to stay. The key connection with emotions is that machine learning allows computers to extract patterns from emotional behaviour, classify them and adapt to new circumstances. This means that software applications become more accurate in predicting and making judgements about emotional behaviour without being explicitly programmed. As per the list above, the input features might be facial expressions, voice samples, biofeedback data or in-world VR behaviour, and the outputs are likely to be classified emotional states. Just think of how this might apply to media content, digital agents, devices and things we encounter in our environments!
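As an illustration of that pattern, here is a minimal sketch using scikit-learn, with feature vectors in and classified emotional states out. The biofeedback readings, feature choices and labels are entirely hypothetical.

from sklearn.ensemble import RandomForestClassifier

# Hypothetical input features per sample:
# [heart_rate_bpm, skin_conductance_microsiemens, respiration_rate]
X_train = [
    [62, 0.2, 12],
    [110, 0.9, 22],
    [75, 0.4, 14],
    [118, 0.8, 24],
]
# Hypothetical output labels: classified emotional states.
y_train = ["calm", "stressed", "calm", "stressed"]

model = RandomForestClassifier(random_state=0)
model.fit(X_train, y_train)

# The classifier now generalises to unseen readings without being
# explicitly programmed with a rule for each case.
print(model.predict([[100, 0.7, 20]]))  # expected output: ['stressed']

The same supervised recipe applies whether the inputs are facial-expression features, voice samples or in-world VR behaviour; only the feature extraction step changes.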

 

Machine-readable human life: OK or not?

As many of the publications on this site point out, this comes with benefits: simply put, stuff works better. On the other hand, what is proposed is making human emotional life machine-readable. Is this OK? If so, on what terms? If not, precisely why? See the section on critical issues.

[1] Russell, S. and Norvig, P. (2010) Artificial Intelligence: A Modern Approach (3rd ed.). Englewood Cliffs, NJ: Prentice Hall, p. viii.