Berlin, IEEE, #P7014 and The Sprint


This week was spent in Berlin debating the ethics of ‘empathic technologies’ that sense and interact with human affects and emotions. The goal for us IEEE P7014 group members was to think about how to design global standards for organisations working with data about emotions/affect, standards that sit in relation to human rights and regional law. How does one do this, given the myriad of use cases? Some uses are entirely benign, some terrifying, and others currently freaky but likely to become routine. And, given this is emergent, how does one anticipate the unanticipatable? Also, what is the point of a “standard” that is entirely voluntary?

 

The warm-up

Around 16 of the 60 or so Working Group members were able to make it in person. Although dominated by a disproportionate number of people from North America and Europe, the group also included members from Malawi, Turkey, Israel, India and Japan. It quickly became clear that one has to proceed in a highly reflexive way (i.e. keenly mindful of who is not in the room, as well as who is). Balance and better representation are the ideal, but seemingly not feasible for this endeavour. One for us governance academics to think about, for sure. Of course, even with all regions represented, this does not necessarily mean all of their people are. More needs to be done on gender though, an easier fix.

In the early meetings of the week, ethicists, lawyers, engineers and social scientists debated and explored ethics and use cases. Regionality featured, such as the fact that the UDHR does not apply everywhere, but what was most interesting for me was how differently people think. Engineers thought in visual, clear, schematic, graphical and actionable terms, while the “woolly folk” :) (such as myself) were finding the [important] outliers that bucked the system. People are different, who knew?! It really is incumbent on those of us working with ethics and governance to argue clearly and simply without falling into meta-banalities, while keeping dignity and human flourishing in view. Fair to say, we did a lot of talking but not much actual writing. Was this going to be an expensive ‘talking shop’? After all, IEEE had spent big to get all the P7000 groups in one space, and very busy people had given up a week.

 

The sprint

It turns out not. Like much creative work, there was mess at the outset, questioning of whether anything would come to pass, and lows, but then elation as order emerged. Building on the chaos, within three hours on the morning of the final day we had a draft structure, the capacity to host and incorporate future disagreements, and clear findings and recommendations that will methodologically and ethically advance how we govern the production, deployment and use of empathic technologies.

On value, and why the activity of P7014 and the wider P7000 series (which addresses ethics and technology) matters: there are lots of reasons, but perhaps foremost is that laws differ around the world and do not address the specifics of technologies and business cultures. Compliance with regional law is good (and a start in some cases!), but we can do better. IEEE standards can play a key role here, as well as providing regional data protection authorities with insight on how to address these technologies. It’s worth acknowledging the role of focus too; for sure, the broad discussion around AI ethics is needed when human rights and law take primacy (there’s a lot of it, I know!), but the value of the P7014 work is that it is focused on one key aspect of AI. This will make its recommendations and standards more meaningful.

It was a productive and heartening week, made possible by human connection, debate and talking, but also by knowing when to stay quiet. Overall though, empathy really is a thing, that is, the capacity to approximate the views of others and appreciate perspective. Time will tell (but hopefully not too much!) whether we can sort out the simulational and technological version of this.

Andrew McStay