
Letter from Vivian

Despite the rapid deployment of artificial intelligence (AI) across all sectors, society has only recently begun to orient itself to this powerful technology’s tremendous ethical and political ramifications. We’re now seeing a flurry of principles and governance models to help direct its future design. One of the most fascinating new dimensions is the ethical and psychological impact of simulated emotional responses by robots and other AI systems.

In January 2020, the Aspen Institute convened twenty-five leaders across industry, academia and civil society to explore the boundaries of “artificial intimacy”. Our purpose was to examine the implications for society and to propose potential interventions to ensure ethical and moral standards for this type of human-machine relationship. Participants raised a range of provocations, from how individuals will interact with machines to what happens to human-to-human connections, our communities and our societies, as human-machine interactions become more interactive and embedded in our daily lives.

The following report, “Artificial Intimacy,” authored by Dr. Kristine Gloria, reflects on these discussions and debates. Specifically, the report identifies the key themes and critical issues surrounding the idea of artificial intimacy, along with the shared language that emerged among participants. More importantly, the report captures a sense of urgency around the opportunities and costs of an “emotional” human-machine relationship, and asks what safeguards (technical, legal or normative) we should consider to protect against the potential for harm.

The report is divided into three sections. The first, Darwinian Buttons, examines the tradition of human psychology with a specific focus on empathy. The second, Design Decisions, explores man’s progress toward more intelligent and empathetic machines, agents, and robots. It features real-world examples that illustrate the connection between form and function. Most importantly, this section tees up a core question: To what extent is the projection of personhood onto a “humanoid machine” dictated by the machine’s form or function? The last, Philosophical, Poetic and Political, bridges the conceptual framework of artificial intimacy from individual human behavior to real-world implications.

Acknowledgements
On behalf of the Aspen Digital Program, I want to thank the Patrick J. McGovern Foundation for its support and leadership in developing this roundtable. I also want to thank Charlie Firestone for his 30 years of leadership as the Executive Director of the Communications and Society Program and for moderating this roundtable. Thanks also to Dr. Kristine Gloria, our rapporteur, not only for capturing the various dialogues, debates and nuanced viewpoints of participants but for her passion for both the science and the poetry of technology. As is typical of our roundtables, this report is the rapporteur’s distillation of the dialogue; it does not necessarily reflect the opinion of each participant in the meeting. Finally, I would like to thank Tricia Kelly, Managing Director, for her work on the conference and for bringing this report to fruition.

Vivian Schiller
Executive Director
Aspen Digital Program
The Aspen Institute
April 2020

 
 