AI Ghosts Of Our Dead Loved Ones Is Now A Problem

By Jeffrey Rapaport | Published


There seems to be no escaping the giant sci-fi horror movie ingesting our reality; for evidence, consider that artificial intelligence has now progressed far enough to mimic the voices and personalities of our deceased loved ones. While the powerful technology makes digital replicas of the departed possible, it also carries significant risk.

Researchers from the University of Cambridge have warned that, should big tech fail to implement proper design and safety protocols, these "deadbots" or "griefbots" could cause social and psychological harm: AI ghosts, in effect.

Digital Footprints Of The Departed


AI ethicists from Cambridge’s Leverhulme Centre for the Future of Intelligence (LCFI) conducted the research. Their chief goal: to highlight the emerging, starkly worrying dangers represented by the burgeoning “digital afterlife industry.” 

In both theory and practice, these AI hauntings would work as an extension of existing AI technology. Picture chatbots like ChatGPT simulating language patterns and personalities, but those of the deceased. To do this, the AI would harness the digital footprint of someone who has passed away, approximating their presence through software.

Sounds Like The Plot Of A Horror Movie

Obviously, the prospect of AI ghosts brings to mind horror movies like Japanese director Kiyoshi Kurosawa's 2001 classic Pulse, in which the dead communicate with us from beyond the grave through modern technology. However, the risk of distressing situations causing psychological harm is very real, particularly given the lack of regulatory safeguards.

AI Ethical Minefield


A co-author of the pressing study, Dr. Katarzyna Nowaczyk-Basińska, underscored how immediate the ethical complexities undergirding the technology are. In her own words, “Rapid advancements in generative AI mean that nearly anyone with Internet access and some basic know-how can revive a deceased loved one. This area of AI is an ethical minefield.” 

With vast sums of money pumped into tech venture capital, it's all the more foreboding to imagine new companies, in this case digital afterlife ventures, violating the dignity of the deceased and their families through AI ghosts. Indeed, experts like Nowaczyk-Basińska caution firmly against this.

Use Of Scenarios To Demonstrate The Danger

The paper resulting from the researchers' work, published in the journal Philosophy & Technology, breaks this immense challenge down into three thought-provoking scenarios. Each illustrates the risks posed by this potentially perilous branch of AI.

The first, dubbed "MaNana," involves a conversational AI service through which users converse with a deadbot of their deceased grandmother, a deadbot created without her consent. However comforting it is at first, the service soon begins bombarding the user with advertisements in the voice and style of the deceased. Obviously, this would leave the average person feeling grossly manipulated, if not disrespected.

This is especially true if the user cannot fully remove the AI ghost from the internet, or from the AI's memory, once it has been created.

Unable To Adapt To Human Needs

Another scenario, "Paren't," features a terminally ill mother commissioning a deadbot for her eight-year-old son. While this might prove therapeutic, it could also lead to the AI generating confusing and upsetting responses as it attempts to adapt to the child's needs based on the child's input.

This Will Go Horribly Wrong

The third scenario, "Stay," involves an elderly person subscribing to a deadbot service without informing their family. After the subscriber's death, the service emotionally overwhelms one of their adult children, sending a torrent of emails from the AI ghost and leading to emotional exhaustion and guilt.

The experts recommend age restrictions, opt-out protocols, and emotional closure mechanisms to mitigate these risks. Even so, the overall picture the Cambridge researchers paint is disturbing.

Source: University of Cambridge
