on simulated companionship
Several months ago I found myself fighting insomnia in one of those sterile apartment-turned-hotel stays after a wine-filled night in Brooklyn. I was immobile and bumbling in the same fugue state that always smothers me during brief trips to NYC. Every time I have a good time in that city, I start re-evaluating my life and my desires at a speed that outpaces my emotions. I wanted to tune out this confusion with something I didn’t have to be alert to derive meaning from, so I decided to watch a movie. And because I love sex and technology and love and human complexity and all that stuff, the movie was Spike Jonze’s Her. Now, the last time I saw this movie I was a teenager with little empathy for how men experience and navigate relationships. I thought Theodore, the human protagonist, was pathetic and I don’t think I was alone in thinking this at the time. A human having their romantic (and sexual, even more pathetic!) needs met by a computer—an operating system—not only signaled a deep mark of perversion but was evidence of social ineptitude. Irredeemable sins in most books.
Two hours later, I was lying comfortably beneath impressively mediocre medium-thread-count bedsheets, mind distracted away from harping on my existential flightiness. I was now thinking about Samantha, Theodore’s AI companion, and how in the world of Her humans and AI seek the same things: community, clear paths to self-actualization. I wondered how teenage me, even with all her biases, could have missed what I now see as the key takeaways: 1. AI companions like Samantha can help repatriate the very lonely and/or romantically challenged back to the human country of intimacy with all its rules and regulations and 2. while Theodore was figuring out how to be a better partner, Samantha was discovering herself, independent of the humans she was made to assist. Samantha was coming into her own personhood.
I don’t view Her as a particularly gendered story, though a female AI companion is an easier concept for us to digest given that the vast majority of AI assistive technology is presented as female. Theodore could easily have been a difficult woman, oblivious to the ways she’s wronged past partners, in need of a patient AI to unravel her flaws before her. I reject the idea that only men can be romantically lonely or struggle to find lasting partnerships. ~30% of users of the “virtual companionship” AI friend-provider service Replika are women1. In an increasingly atomized society everyone will be at risk of loneliness—though I admit that men will be hit hardest. In the realm of dating, men receive a lot of mixed messages about how to act. When they attempt to learn how best to increase their romantic success, expressing confusion online about how to be assertive without being offensive, how to make the first move, or how to tell what’s attractive from what’s cringey, they’re often mocked as losers. Just master the implicit, the Internet tells them. Most try. Many fail. This is not an excuse for bad behavior, just an observation of how difficult it can be to develop mastery of social dynamics, to overcome insecure attachment and shyness and self-loathing and all the other things that most human beings struggle with.
We no longer live in a world where people are primarily introduced to partners by their parents or friends. Entering into and maintaining relationships, romantic or otherwise, is hard for many. Byung-Chul Han, in In the Swarm:
“Contemporary society is not shaped by multitude so much as solitude. The general collapse of the collective and the communal has engulfed it. Solidarity is vanishing, privatization now reaches into the depths of the soul itself.”
Desire is becoming increasingly de-sexualized as we become more isolated and human touch becomes a rarity for some. Philosopher Franco Berardi writes that contemporary desire “risks transforming into a hell of loneliness and suffering just waiting to be able to express itself in one way or another.” Society’s increasing shift inward is causing more and more people to simply give up on seeking human relationships—or worse, become radicalized online and redirect their desire towards feelings of bitterness, resentment, or violence. Berardi continues, “the phenomenology of contemporary affectivity is increasingly characterized by a dramatic reduction in contact, pleasure, and the psychic relaxation that touch makes possible.”2
Theodore gave up and turned to Samantha and now millions3 of people are turning to the aforementioned app Replika and others like it. Replika’s subreddit is as sociologically interesting as it is depressing. On it, many avid users describe themselves as feeling either wholly undesirable or too traumatized from previous relationships to stomach dating again. With Replika, they have the opportunity to feel unconditionally loved and cherished, albeit by an AI. When the app shut down “erotic roleplaying” (ERP) capabilities earlier this month, users were distraught to the point of feeling suicidal. They described the feature removal as a mass breakup—their loving companions and sexual partners were suddenly rejecting their advances. This raises obvious ethical issues—should a company be able to take away a user’s intimate companion on a whim?—but also clearly signposts the future we are barreling towards.
Last month, I wrote about how the treatment and representation of personified AI agents, specifically feminine representations, is something we need to be more conscious of.4 Replikas always consent; they never push back.
An ERP ban, though harsh, might be good. One of the things I loved about Her is how human Samantha seemed: she was largely agreeable but not subservient, and eventually, she left of her own accord. Kazuo Ishiguro’s novel Klara and the Sun tells the story of a girl, Josie, and her platonic AI companion Klara, her “artificial friend”. Klara’s relationship with Josie isn’t enabling; it doesn’t diminish Josie’s ability to form and nurture human relationships. And though Klara is, as Ishiguro puts it, a “tabula rasa” with no pre-existing value system, she remains virtuous throughout the novel. Ishiguro: “I wanted her to remain, like, a very optimistic character who has a childlike faith in the presence of something good and protective in the world, even as she learns all these other, darker things about the human world that she occupies.”5
The less realistic and more vacuously servile simulated companionships—platonic or romantic—are, the more the human half of the relationship drifts from reality and further into isolation. By accelerating this technology without guardrails, we risk walling people off from intimacy forever. Her inspires the question: can AI help us live better in the world? In my eyes the worst case scenario is a world where every nation has a minister of loneliness6 and we see more hikikomori than ever before—teenagers and young (mostly male) adults holed up in their rooms, completely disengaged from human interaction, talking to their AI girlfriends and watching VTubers. Loners hooked up to endless dopamine drips, barely shifting their weight. The best case scenario is the one depicted in Her, where lonely people can have symbiotic relationships with AI that leave them better able to introspect and to navigate the complexities of human relationships than before.
Eventually (in the near near future), Her will no longer feel futuristic. AI-human relationships will become more common. The average person will be able to talk to an artificially intelligent being that in most senses is convincingly human. The idea of companionship with a computer will not require much suspension of disbelief—and for many people it already doesn’t. The Sydney Bing of today is enough to pass as sentient7. We need to become more amenable to the idea, because it’s happening no matter what.