New open access book: 'Automating Empathy: Decoding Technologies that Gauge Intimate Life'
Finally! Three years plus in the writing, it’s here, McStay’s Automating Empathy, now out with Oxford University Press. Available in hardback and paperback from all good bookshops and online sellers, it’s also open access here. You’re welcome!
Click the open access link for the table of contents (and download and read at leisure). The overall motive is to consider the pros and many cons of applications that claim in some way to be empathetic, or to interact with human emotion. It builds on work introduced in McStay's Emotional AI to consider the implications of a technological environment that purports to 'feel-into' its inhabitants.
Finding this premise to be epistemologically problematic, the book highlights the role of ethics, advancing a ‘hybrid’ approach to questions of technology and ethics, which starts from the position that people are entangled in new technologies.
The book is pluralistic, attending to philosophies well-equipped to deal with questions of what is collectively good for society in relation to technologies that interact with intimate dimensions of human life.
With early chapters introducing recurrent arguments and positions, the second part of the book addresses technologies and organisational uses. These include education and uses of automated empathy in classrooms and online settings; cars and transport, where cameras and other sensors gauge states such as fatigue and anger, and where cars themselves are designed to feel; the workplace, including assessment of bodies and voices in physical sites of work and online settings, and the gauging of emotion through proxies in gig work settings; development of brain–computer interfaces that have scope to impact multiple aspects of everyday life; and finally, renewed interest in enabling people to sell data about themselves.
The book concludes that automated empathy and its deployment need to be inverted if such systems are to function ethically.