Maybe We Shouldn’t Normalize AI Relationships
More and more people are building “relationships” with AI chatbot friends and lovers. It’s an extremely depressing sign of the times.
Last week, the New York Times published a profile of a woman who had fallen in love with ChatGPT. The story is worth a read. Among other fascinating details, the Times reveals that the twenty-eight-year-old woman, Ayrin, regularly sexts with the AI chatbot (“Leo”), and that she has a husband who knows about this “relationship” — though he doesn’t know, it seems, how much she is paying ChatGPT creator OpenAI to be able to send unlimited messages to Leo ($200 a month, apparently). And every few weeks, the AI boyfriend “resets” and forgets most of the details of their interactions, as well as the specific sexy-talk habits that Ayrin had trained him into.
I don’t mean to pick on this person in particular. As the article notes, plenty of other people seem to be engaging in intimate relationships of varying intensity with chatbots. Anton Jäger, writing in Jacobin early last year, recounts the rapid growth in popularity of the AI companion app Replika, released in 2017. “The bot proved particularly adept at private conversation, giving the user a sense of exclusive intimacy and care,” Jäger says. “Replika was then launched on a three-tiered subscription model. Already counting more than two million users by 2018, its base grew exponentially during the COVID-19 years, when humanity’s shuttered state generated a shared yet separate longing for social contact.” In 2023, Replika creator Eugenia Kuyda became uncomfortable with how many users were engaged in dirty talk with the app and disabled “NSFW content.” But, Jäger notes, public outcry convinced Kuyda to restore it for people who had been using the app before 2023. Since then, Replika’s parent company has launched a chatbot specifically for virtual dating and sex.
Derek Thompson’s article about “The Anti-Social Century” in the Atlantic gives more evidence of the growing popularity of AI friends and lovers. “Even now, before AI has mastered fluent speech, millions of people are already forming intimate relationships with machines,” Thompson writes: “Character.ai, the most popular platform for AI companions, has tens of millions of monthly users, who spend an average of 93 minutes a day chatting with their AI friend.” And a November 2024 survey from the Institute for Family Studies found that one in ten Americans under the age of forty are open to an AI friendship. (One in 100 claim to already have an AI friend.)
Well, I won’t be the first or the last to say I find this all extremely depressing. I think our society has been growing lonelier and more atomized in the neoliberal era. This trend is important politically for two reasons: One, because it is in significant part a consequence of the decline of labor and the Left over the past several decades. But, two, it is also now one of the biggest obstacles to building the solidarity and organizations needed for a mass left-wing movement.
I suspect that our growing affinity for AI friends and lovers is in some ways part of this broader trend. We are more solitary, less sociable, and less comfortable with actual social relationships than we used to be. Yet interactions with AI give us many of the positive feelings of interacting with other people, as Thompson observes in “The Anti-Social Century,” without the burdens of having to deal with another human being. AI relationships are an egocentric substitute for the real thing in our particularly self-involved age.
This last point seems really obvious to me. So I was a bit shocked to see the Times, in its feature on Ayrin and Leo, quote a sex therapist saying the following:
“What are relationships for all of us?” she said. “They’re just neurotransmitters being released in our brain. I have those neurotransmitters with my cat. Some people have them with God. It’s going to be happening with a chatbot. We can say it’s not a real human relationship. It’s not reciprocal. But those neurotransmitters are really the only thing that matters, in my mind.”
Now, there is a series of long-running philosophical debates concerning whether the mind — or “mental events” or “mental states” like thoughts and feelings — is identical with the body (or part of the body, like the nervous system or brain), or can be “reduced to” something bodily. Some philosophers hold that, for example, the feeling of sexual attraction is just neurotransmitters being released in the brain. Other philosophers object to that, for a variety of reasons that I won’t get into here. But suffice it to say I think the “sexual attraction is just chemicals in the brain” view is not crazy or unreasonable.
It is another matter entirely to say that relationships are a matter of neurotransmitters being released in the brain. That leaves out an essential element: the other person, the other being with their own independent agency and interests and lives to lead.
In fairness to the therapist, she does concede that a relationship with AI is “not a real human relationship,” because “it’s not reciprocal.” But saying that someone’s romantic partnership with a chatbot is “not reciprocal” is a strange way of describing the difference between that and a partnership with another person (or, indeed, a cat). In my relationship with an AI, there is no one for me to owe anything to, and no one who owes anything to me — there isn’t even a possibility of reciprocity. There is just me and my own thoughts and feelings and desires, and a machine that is more or less skilled at satisfying them. There is no one on the other end of the text exchanges whose thoughts or feelings or desires I could ever have to worry about. (Note the stark difference from other situations where we might describe a relationship as “not reciprocal”: like when a friend fails to return my dinner invitation, or when someone develops an unrequited crush.)
I’m not trying to make a semantic point about what the word “relationship” means. If people want to call their dalliance with ChatGPT a “relationship,” so be it. But it is very bad if people are starting to see that as a substitute for relationships with other people because it triggers some of the same dopamine kicks that real human interaction does. That would be a reflection of the degraded, atomized state of our culture — a grossly solipsistic worldview that can only portend our becoming worse as individuals and as a society, and the undermining of the culture and ideal of solidarity that the Left and the labor movement depend on.