AI ethicists from the University of Cambridge have expressed grave concerns about the psychological impact of AI chatbots designed to simulate conversations with deceased loved ones, warning that the technology could be used to digitally "haunt" the living.

(Photo: Tung Lam from Pixabay)

The Rise of Deadbots

These technologies, known as "deadbots" or "griefbots," use artificial intelligence to mimic the language and personality of the deceased based on their digital footprint.

Companies offering these services tout them as providing a postmortem presence, enabling users to converse with AI versions of lost family members or friends.

Researchers from Cambridge's Leverhulme Centre for the Future of Intelligence (LCFI) highlight significant ethical challenges posed by the development of these technologies, particularly due to the absence of robust safety standards and regulations.  

"Stalking by the Dead"

One concerning scenario described by researchers involves companies using deadbots to subtly promote products or services, exploiting vulnerable individuals for financial gain under the guise of a deceased loved one. 

Additionally, the researchers caution about the potential for surviving family members to receive unsolicited notifications, reminders, or advertisements from AI chatbots, effectively creating a form of digital "stalking by the dead."

While some individuals may initially find solace in interacting with AI versions of their deceased loved ones, researchers warn that continuous emotional interactions could become overwhelming over time. 

Moreover, individuals may feel powerless to suspend or terminate these AI simulations if the deceased had previously signed lengthy contracts with digital afterlife services.

Dr. Katarzyna Nowaczyk-Basińska, a study co-author, underscores the ethical complexities inherent in recreating deceased individuals with AI, pointing to concerns about the dignified treatment of the dead and the potential exploitation of grieving people for profit.

The researcher notes that the emergence of platforms offering AI recreations of the deceased underscores the need for stringent design protocols and ethical guidelines to safeguard the rights and dignity of both data donors and users.


Digital Hauntings From Dead Loved Ones

Companies like Project December and HereAfter already offer services that use AI to recreate deceased individuals. These platforms leverage advanced AI models to generate text- and voice-based interactions that mimic the personality and communication style of the deceased.

Another co-author, Dr. Tomasz Hollanek, emphasizes the emotional vulnerability of individuals forming strong bonds with AI simulations of deceased loved ones. He advocates for dignified methods and rituals to retire these AI chatbots, proposing digital funerals or other cultural ceremonies.

The researchers advocate for design protocols preventing the exploitation or misuse of AI chatbots for advertising purposes. They stress transparency and informed consent in developing and deploying digital afterlife services.

"These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating," Hollanek said in a press release.

"We need to start thinking now about how we mitigate the social and psychological risks of digital immortality, because the technology is already here," Nowaczyk-Basińska added.

The findings of the researchers were published in the journal Philosophy & Technology. 



ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.