What is Emotional AI?

Emotional AI derives from affective computing techniques and from advances in machine learning and artificial intelligence (AI). It is a weak form of AI [1], in that these technologies aim to read and react to emotions through text, voice, computer vision and biometric sensing, but they do not have sentience or emotional states themselves.

How?

The following techniques are used to try to sense and discern people’s emotions and expressions:

Analysis of online sentiment: this examines online language, emojis, images and video for evidence of moods, feelings and emotions (a minimal scoring sketch appears after this list).

Facial coding of expressions: this analyses faces from a camera feed, a recorded video file, a video frame stream or a photo, although the effectiveness of the methodology is debatable.

Voice analytics: the emphasis is less on analysing natural spoken language (what people say) than on how they say it, including elements such as the rate of speech, increases and decreases in pauses, and tone.

Eye-tracking: this measures point of gaze, eye position and eye movement.

Wearable devices: these sense galvanic skin responses (GSR) that indicate emotional stimulation; muscle activity and muscle tension (electromyography); heart rate (blood volume pulse); skin temperature; heartbeats and rhythms (electrocardiogram); and respiration rate. Skull caps and other headwear measure brain activity (electroencephalography). A sketch of how a GSR trace might be reduced to arousal events appears after this list.

Gesture and behaviour: cameras are used to track the hands, face and other parts of the body that are said to communicate particular messages and emotional reactions.

Virtual Reality (VR): here people give themselves over to a synthetic environment, which allows remote viewers to understand and feel-into what the wearer is experiencing. Headwear may also contain EEG sensors and sensors to register facial movement.

Augmented Reality (AR): this is where reality is overlaid with additional computer-generated input (such as graphics). Remote viewers can track attention, reactions and interaction with digital objects.
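
To make the first technique above concrete, here is a minimal sketch of lexicon-based sentiment analysis in Python. The word lists and scoring rule are invented for illustration; real systems rely on large, validated lexicons or trained models.

```python
# Minimal lexicon-based sentiment scoring. The tiny word lists below are
# illustrative stand-ins for the validated lexicons real systems use.
POSITIVE = {"love", "great", "happy", "excellent", "joy"}
NEGATIVE = {"hate", "awful", "sad", "terrible", "angry"}

def sentiment_score(text: str) -> float:
    """Score text from -1 (negative) to +1 (positive); 0 means neutral."""
    words = [w.strip(".,!?;:") for w in text.lower().split()]
    hits = [(w in POSITIVE) - (w in NEGATIVE) for w in words]
    scored = [h for h in hits if h != 0]
    return sum(scored) / len(scored) if scored else 0.0

print(sentiment_score("I love this, it is great"))  # 1.0
print(sentiment_score("an awful, sad experience"))  # -1.0
```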
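
The wearables item can be sketched in the same spirit: reducing a stream of GSR readings to candidate arousal events by finding peaks above a threshold. The signal values, units and threshold here are hypothetical, not taken from any real device.

```python
# Reduce a GSR (skin conductance) trace to candidate arousal events by
# finding local maxima above a threshold. All values are hypothetical.
def gsr_peaks(samples: list[float], threshold: float) -> list[int]:
    """Return indices of local peaks above the threshold."""
    return [
        i for i in range(1, len(samples) - 1)
        if samples[i] > threshold
        and samples[i - 1] < samples[i] >= samples[i + 1]
    ]

trace = [0.2, 0.3, 0.9, 0.4, 0.3, 0.8, 1.1, 0.7, 0.2]  # invented readings
print(gsr_peaks(trace, threshold=0.5))  # [2, 6]
```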

 

Machines that learn

The rise of technical interest in emotional life is indivisible from the growth in applications of AI and machine learning. While it is true that these technologies are undergoing a hype cycle, this should not detract from the fact that they are here to stay. The key connection with emotions is that machine-learning techniques allow computers to extrapolate patterns from emotional behaviour, classify those patterns and adapt to new circumstances. Input data might be facial expressions, voice samples, biofeedback data or in-world VR behaviour, and the outputs are likely to be classified emotional states. A minimal sketch of this pipeline is given below.
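
The following sketch illustrates that learn-then-classify pipeline with a standard classifier; it is not any particular vendor's method, and the features ([speech rate, pitch variance, skin conductance]), labels and values are all invented for illustration.

```python
# Sketch of the pipeline described above: feature vectors derived from
# voice or biofeedback go in; a classified emotional state comes out.
# All training data here is hypothetical.
from sklearn.linear_model import LogisticRegression

X_train = [
    [3.1, 0.9, 0.8],  # fast, variable speech; high skin conductance
    [2.9, 0.8, 0.9],
    [1.2, 0.2, 0.1],  # slow, flat speech; low skin conductance
    [1.4, 0.3, 0.2],
]
y_train = ["stressed", "stressed", "calm", "calm"]

model = LogisticRegression().fit(X_train, y_train)

# Extrapolating to a new, unseen sample is the "adapt to new
# circumstances" step the paragraph describes.
print(model.predict([[3.0, 0.7, 0.85]]))  # ['stressed']
```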

 

Machine-readable human life: OK or not?

As many of the publications on this site point out, these technologies have potential benefits. Simply put, stuff works better: gaming, for example, benefits from biometric inputs. On the other hand, what is proposed is to make human emotional life machine-readable by a range of devices, in a range of contexts and for diverse purposes (e.g. personal, commercial and security). Is this OK? If so, on what terms? If not, precisely why? See the section on critical issues.

[1] Russell, S. and Norvig, P. (2010) Artificial Intelligence: A Modern Approach (3rd ed). Englewood Cliffs, NJ: Prentice Hall. p. viii
