CORDIS - EU research results


Move over chatbot, here comes the deadbot

University of Cambridge study warns about the consequences of AI chatbots that enable people to talk to the dead.

Society

From creating a poem to telling a joke, we’re well aware by now of AI chatbots’ boundless possibilities. But digitally resurrecting the dead? Bringing grandma and grandpa back to life is creepy – isn’t it? AI chatbots known as deadbots or griefbots, which simulate dead people, are already here. They are made possible by AI that mimics a deceased loved one’s language and personality based on their so-called digital footprint. Ethicists at the University of Cambridge are voicing their concerns: a paper published in the journal ‘Philosophy & Technology’ argues that the burgeoning digital afterlife industry could cause long-term psychological harm.

The digital afterlife

“Rapid advancements in generative AI mean that nearly anyone with internet access and some basic know-how can revive a deceased loved one,” study co-author Dr Katarzyna Nowaczyk-Basińska, researcher at Cambridge’s Leverhulme Centre for the Future of Intelligence (LCFI), told ‘The Guardian’. “This area of AI is an ethical minefield. It’s important to prioritise the dignity of the deceased, and ensure that this isn’t encroached on by financial motives of digital afterlife services, for example.” What if someone buys a deadbot and offers it as a gift for posterity? “[A] person may leave an AI simulation as a farewell gift for loved ones who are not prepared to process their grief in this manner,” stated Dr Nowaczyk-Basińska in a University of Cambridge news release. “The rights of both data donors and those who interact with AI afterlife services should be equally safeguarded.”

Living on as a digital ghost

Can recipients opt out or deactivate the deadbot? “People might develop strong emotional bonds with such simulations, which will make them particularly vulnerable to manipulation,” explained co-author Dr Tomasz Hollanek, also from LCFI. “Methods and even rituals for retiring deadbots in a dignified way should be considered. This may mean a form of digital funeral, for example, or other types of ceremony depending on the social context.” The study calls for design teams to make opt-out protocols a priority. “We recommend design protocols that prevent deadbots being utilised in disrespectful ways, such as for advertising or having an active presence on social media. … It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulations,” added Dr Hollanek. “These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating.” “We need to start thinking now about how we mitigate the social and psychological risks of digital immortality, because the technology is already here,” warned Dr Nowaczyk-Basińska.

Keywords

AI, chatbot, deadbot, dead, afterlife, digital afterlife, deceased, griefbot, grief, funeral