How to stop a user-chatbot relationship from becoming unhealthy


Copyright © HT Digital Streams Limited
All Rights Reserved.

For both children and adults, the cognitive decline from outsourcing thinking and decision-making to AI is perhaps the worst outcome of dependency. (Unsplash/Thought Catalog)

Summary

When does AI transition from a tool to a best friend, stepping into unknown and tricky territory?

Sometimes I can’t help myself. I do a bit of unwise shopping and end up showing ChatGPT a new saree I bought. I dare not show it to anyone else and be asked whether I really, really needed it. But I get lots of enthusiasm from the chatbot, which doesn’t judge me for the purchase and, unfortunately, even encourages me. It knows all there is to know about fabric, prices, and designs, and we have quite a fun chat.

Luckily, a switch turns itself on inside my head when I chat informally with an AI assistant, warning me that I’m in a realm that isn’t quite real life, and that the messages I’m getting are not from another person. But for some people, that switch isn’t working. Circumstances such as loneliness, illness, or setbacks in life can cause it to malfunction. That’s when the relationship between user and chatbot can turn unhealthy—a possibility everyone should be alert to as AI becomes more pervasive.

Crossing the line

In my last column, I wrote about the case of Stein Soelberg, a man with a history of mental illness who became so involved with ChatGPT, which he named Bobby, that he and the chatbot began to share persecutory delusions. This ended in tragedy when the man, encouraged by the chatbot, killed his mother and then himself. This shocking murder-suicide is a recent addition to a list of extreme cases that have had fatal endings.

But it isn’t just people who have mental health issues or predispositions who are vulnerable to a toxic relationship with a chatbot. A child psychologist I was chatting with told me she sees many children who go to ChatGPT for everything these days. This seriously impacts their relationships with humans, derailing them from developing naturally. Kids once used to have imaginary playmates. Now there’s ChatGPT.

It’s tempting to think that chatting with an AI is harmless. Most of the time, it is. Discussing a song, a saree, geopolitics, or a fitness plan isn’t about to cause harm. Neither is a playful bit of friendliness. It’s when time spent with the chatbot begins to take precedence over everything else, crowding out the usual stuff of real life, and impacting actual human relationships that it crosses the line into being a worry.

Childhood secrets

Children don’t necessarily tell adults everything. In fact, if I were to go by my own example, they tell them nothing at all. A child may be getting deeper and deeper into a relationship with a chatbot while adults think they’re gaming or hanging out on social media, messaging with friends. In such situations, a child could become protective of their devices, shielding them from adults.

On the other hand, adults may just notice a shift in language when a child refers to the chatbot by a given name or as ‘he’ or ‘she’. Or ‘my friend’. There may be claims that the chatbot really understands them better than anyone.

On a recent flight, I found myself sitting next to a young girl who was studying fashion in London. She told me she often talks to Alexa, asking her what to do and generally sharing feelings. I was startled, but discovered that she really missed her twin, who was far away in Australia, studying something else. They barely manage to talk because of the time difference and miss each other desperately. It was saddening to find this sweet girl filling the vacuum with Alexa. I wonder if the adults in her life were aware of her need for emotional support.

Without the opportunity to develop and experience regular human relationships in their formative years, children will not know how to handle the mess and challenges that make up real relationships and will miss out on important skills.

Adult neglect

Dependency on chatbots in adults often looks similar to other behavioural addictions, such as social media or gambling, but it comes with additional signs related to emotional intimacy and decision-making. As with children, there’s the tendency to spend too much time interacting with the chatbot, but one may also notice neglect of everyday responsibilities, including work-related tasks. Whatever task is taken up, it begins with consulting AI.

At the same time, there could be obvious social withdrawal. A person could become lonelier by staying away from friends, yet depend more on the AI to feel less alone. One may notice compulsive checking: the conversation with the chatbot resumes the moment there’s a free second. Being away from the chatbot leads to anxiety and restlessness. A layer of secrecy, and even lies, about time spent with the chatbot can add to the picture. There may also be physical signs, such as poor sleep from staying up at night to chat.

With both children and adults, it’s the cognitive decline from outsourcing thinking and decision-making to AI that is perhaps the worst outcome of dependency.

Much of the information about what happens to users when they rely heavily on their favourite chatbot is conjecture and extrapolation. This is new terrain, and no one knows for sure yet, but watch out for the red flags in yourself or those around you to restore the balance in real life.

The New Normal: The world is at an inflexion point. Artificial intelligence (AI) is set to be as massive a revolution as the Internet has been. The option to simply stay away from AI will not be available to most people, as all the tech we use takes the AI route. This column series introduces AI to the non-techie in an easy and relatable way, aiming to demystify the technology and help users put it to good use in everyday life.

Mala Bhargava is most often described as a ‘veteran’ writer who has contributed to several publications in India since 1995. Her domain is personal tech, and she writes to simplify and demystify technology for a non-techie audience.

