AI Companions

AI companions are increasingly present in everyday life, used for conversation, advice, emotional support, and romantic or sexual purposes. Our research explores which platforms people (including adults and children) are using, how they are using them, what they think of them, and how these systems should be governed.

See below for academic papers, policy reports, and work we are leading on international standards that guide the responsible design, deployment, and oversight of these systems.

Coming soon!

2026 Report: Do AI Companions Understand? Most UK Teens Say Yes


Standards

Over the past two years we have chaired and worked with IEEE and its volunteers to develop a technical standard that directly addresses the overlap between emulated empathy and human-AI partnering, covering AI companions, agents, assistants, social chatbots, and robots/physical AI. Snappily titled the Recommended Practice for Ethical Considerations of Emulated Empathy in Partner-based General-Purpose Artificial Intelligence Systems, it is due out in Q1 or Q2 2026.


White papers

V. Bakir and A. McStay, "Is Deception in Emulated Empathy Innately Bad?", IEEE Xplore, 13 Dec. 2024.

V. Bakir, K. Bennet, B. Bland, A. Laffer, P. Li and A. McStay, "When is Deception OK? Developing the IEEE Recommended Practice for Ethical Considerations of Emulated Empathy in Partner-based General-Purpose Artificial Intelligence Systems", IEEE ISTAS, Cholula, Sept. 2024.

A. McStay, F. Andres, B. Bland, A. Laffer, P. Li and S. Shimo, "Ethics and Empathy-Based Human-AI Partnering: Exploring the Extent to which Cultural Differences Matter When Developing an Ethical Technical Standard", IEEE Xplore, pp. 1-28, 28 Aug. 2024.