Ghostbots: AI versions of deceased loved ones could be a serious threat to mental health


PTI, Mar 15, 2024, 11:55 AM IST

We all experience loss and grief. Imagine, though, that you didn't need to say goodbye to your loved ones: that you could recreate them virtually, hold conversations with them and find out how they're feeling.

For Kim Kardashian's fortieth birthday, her then husband, Kanye West, gave her a hologram of her late father, Robert Kardashian. Kim Kardashian reportedly reacted with disbelief and joy to her father's virtual appearance at her birthday party. Being able to see a long-dead, much-missed loved one moving and talking again might offer comfort to those left behind.

Digitally resurrecting a deceased loved one might seem miraculous, and possibly more than a little creepy, but what is the impact on our mental health? Are AI ghosts a help or a hindrance to the grieving process?

As a psychotherapist researching how AI technology can be used to enhance therapeutic interventions, I'm intrigued by the advent of ghostbots. But I'm also more than a little concerned about the potential effects of this technology on the mental health of those using it, especially those who are grieving. Resurrecting dead people as avatars has the potential to cause more harm than good, creating further confusion, stress, depression, paranoia and, in some cases, psychosis.

Recent developments in artificial intelligence (AI) have led to the creation of ChatGPT and other chatbots that allow users to have sophisticated, human-like conversations.

Using deepfake technology, AI software can create an interactive virtual representation of a deceased person from their digital content, such as photographs, emails and videos.
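
To make the idea concrete, here is a minimal sketch of how such a representation might be assembled in practice. It is illustrative only: the archived messages, the persona prompt and the model name are assumptions rather than a description of any particular product, and it assumes access to a general-purpose chat model via the OpenAI Python client.

```python
# Minimal, illustrative sketch of a "ghostbot": a chat model prompted with a
# deceased person's digital traces so that replies imitate their voice.
# The archived messages below are invented examples and the model name is an
# assumption; no real product's pipeline is being described.
from openai import OpenAI

# Digital content the person left behind (texts, emails, captions, etc.)
archived_messages = [
    "Don't fret about work, love. It all comes out in the wash.",
    "Saw a robin in the garden this morning and thought of you.",
]

# Turn the archive into a persona prompt for the model to imitate.
persona_prompt = (
    "Reply in the voice of the person whose messages appear below, "
    "imitating their tone, warmth and typical phrases.\n\n"
    + "\n".join(f"- {msg}" for msg in archived_messages)
)

client = OpenAI()  # expects an OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any capable chat model would do
    messages=[
        {"role": "system", "content": persona_prompt},
        {"role": "user", "content": "I miss you. How are you feeling today?"},
    ],
)

print(response.choices[0].message.content)
```

Nothing in a pipeline like this checks or constrains what the model says back in the dead person's voice, which is part of why the oversight concerns discussed later in this piece arise.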

Only a few years ago, such creations were merely themes of science fiction; now they are a scientific reality.

Help or hindrance?

Digital ghosts could be a comfort to the bereaved by helping them reconnect with lost loved ones. They could give users an opportunity to say things, or ask questions, that they never got the chance to while the person was alive.

But the ghostbots’ uncanny resemblance to a lost loved one may not be as positive as it sounds. Research suggests that deathbots should be used only as a temporary aid to mourning to avoid potentially harmful emotional dependence on the technology.

AI ghosts could harm people's mental health by interfering with the grief process.

Grief takes time, and its many stages can unfold over years. The newly bereaved might think of their deceased loved one frequently, recall old memories vividly and, quite commonly, dream more intensely about the person they have lost.

The psychoanalyst Sigmund Freud was concerned with how human beings respond to the experience of loss. He pointed out potential added difficulties for those grieving if there’s negativity surrounding a death.

For example, if a person had ambivalent feelings towards someone who then died, they could be left with a sense of guilt. Or if a person died in horrific circumstances, such as a murder, the bereaved might find the death more difficult to accept.

Freud referred to this as “melancholia”; it is also known as “complicated grief”. In some extreme cases, a person may experience apparitions, hallucinating that they see the dead person and beginning to believe they are alive. AI ghostbots could further traumatise someone experiencing complicated grief and may exacerbate associated problems such as hallucinations.

Chatbot horror

There are also risks that these ghostbots could say harmful things or give bad advice to someone in mourning. Similar generative software, such as ChatGPT, has already been widely criticised for giving users misinformation.

Imagine if the AI technology went rogue and started to make inappropriate remarks to the user, a situation journalist Kevin Roose experienced in 2023, when a Bing chatbot tried to persuade him to leave his wife. It would be very hurtful if a son or daughter conjured up an AI ghost of their deceased father, only to hear that they weren't loved or liked, or weren't their father's favourite.

Or, in a more extreme scenario, imagine if the ghostbot suggested that the user join them in death, or told them to kill or harm someone. This may sound like the plot of a horror film, but it's not so far-fetched. In 2023, the UK's Labour party outlined a law to prevent the training of AI to incite violence.

This was a response to the case of a man who attempted to assassinate Queen Elizabeth II in 2021 after being encouraged by his chatbot “girlfriend”, with whom he had an “emotional and sexual” relationship.

The creators of ChatGPT acknowledge that the software makes errors and is still not fully reliable, because it can fabricate information. Who knows how a person's texts, emails or videos will be interpreted, or what content this AI technology will generate from them?

In any event, it appears that no matter how far this technology advances, there will be a need for considerable oversight and human supervision.

Forgetting is healthy

This latest technology says a lot about our digital culture, which promises infinite possibilities and recognises no limits.

Data can be stored in the cloud indefinitely; everything is retrievable, and nothing is ever truly deleted or destroyed. Forgetting is an important element of healthy grief, but in order to forget, people need to find new and meaningful ways of remembering the deceased.

Anniversaries play a key role in helping those who are mourning not only to remember lost loved ones, but also to represent the loss in new ways. Rituals and symbols can mark an ending, allowing people to properly remember in order to properly forget.

by Nigel Mulligan, Assistant Professor in Psychotherapy, School of Nursing, Psychotherapy and Community Health, Dublin City University (The Conversation)
