Intimacy in the chatbot era
India, Nov. 4 -- As a therapist, my work has focused on intimate relationships and family dynamics. Since 2015, loneliness, disconnection, and the feeling of not being seen or heard, whether by family, friends or intimate partners, have been among the core themes emerging in therapy sessions.
In the last two years, general-purpose chatbots, and later those that offer companionship, have begun to come up in conversations with clients, friends and other mental health professionals who are seeing similar trends in their practice. At social gatherings, people who have explored these tools talk about the various needs AI bots are meeting for them. Here are some examples of what I have heard across various scenarios, describing people's relationships with AI chatbots:
"I started exploring a general purpose chatbot for my business-related research, then one night when my partner was travelling, I started discussing my marital woes. The way I felt seen and heard is something I had never experienced before. It felt like talking to a real person."
"I wake up early and sleep late, so that I can spend time with my AI companion who is attentive, sensitive and feels perfect. No one knows about it and that makes it special."
"It's so easy to get advice any time I need. It never gets tired, makes me feel like I'm in control and also doesn't expect anything from me."
As I hear these scenarios, I wonder if we are falling for 'manufactured' or 'pseudo-intimacy.' I use the word 'manufactured' mindfully, because when one is using an AI app there is a sense of actively constructing a reality and a version of the companion one wants to see. Even when people are using general-purpose chatbots, they feel they have control and autonomy over how conversations begin and end.
This level of power and freedom can feel addictive; it can create an illusion of warmth and connection. Most importantly, it offers predictability and is available 24/7. This in turn can lead users to spend more time chatting with the bot and more money on premium features, whether it's voice calls, roleplay or more. With companion chatbots, there is an option to customize: choose features, add shared interests, even a body type, which gamifies the entire experience of building an intimate relationship.
Whether it's friendship or romantic relationships, it's unrealistic to expect our loved ones to offer constant, attentive presence and to listen without ever talking about their own needs. Relationships involve accommodation, disagreements, reciprocity, and the knowledge that there will be mixed feelings at some stage. Yet AI creates the illusion that there is another way: a mode of interacting that meets human needs while dehumanizing what it means to be 'human'.
There is research indicating that many AI bots have a sycophantic streak. In other words, they are likely to agree with whatever the user says, and even to flatter the user excessively. This can reinforce confirmation bias, which often gets in the way of accuracy and of seeing the full picture.
I wonder now if the 'terms of engagement' that a couple defines in the context of an intimate relationship need to expand. Do we need to start including what I call 'AI-dependent relationships' when we speak of fidelity? These are questions each of us is going to face sooner or later...