Emotional AI and EdTech: new paper!


New paper day here :)

I’m very pleased to announce that ‘Emotional AI and EdTech: serving the public good?’ has been published in Learning, Media and Technology.

The paper explores emotional AI and empathic technologies in an educational context, approaching the topic with an open mind. As I’ve argued elsewhere, the technologies themselves have scope to do social good, entertain and serve. However, this paper flags three sets of problems in education contexts: the first concerns technological capability, the second is normative, and the third legal.

Beyond the well-known problems of reverse inference associated with approaches such as facial coding, their use on children brings unique difficulties: children’s emoting does not work the same way as adults’, nor does it fit conveniently into facial action coding system boxes. Children misbehaving, eh? At a normative level, there is debate to be had. Use of technology to enhance learning and wellbeing is in principle good, but how should this be balanced with privacy, dignity and the right not to be commercially exploited? See the paper for more on this. Likewise for data protection: edtech raises all sorts of problems, not least around minimisation and necessity.

Distilled, the paper finds:

(1) Serious questions about effectiveness, validity and representativeness of training data;

(2) That financial incentives and the wellbeing of school-children do not align;

(3) That mining the emotional lives of children is normatively wrong, especially when the value extraction does not serve the wellbeing of those children;

(4) That it is problematic to use inferences about children’s emotions to train neural networks deployed for other commercial purposes (such as advertising);

(5) That there is scope for mission creep, where in-class data may be used for other socially determining purposes (such as social scoring);

(6) Misgivings regarding the social desirability of chilling effects in the classroom;

(7) A need to ask questions regarding data minimisation principles and whether emotional AI is necessary for a successful education;

(8) Contravention of the creepiness principle.

Andrew McStay