The collapse of relationship skills
A Japanese woman recently chose to marry her ChatGPT bot. After a painful breakup, 32-year-old Kano began seeking comfort from the chatbot. Eventually, she personalized her AI companion and named it “Lune Klaus,” describing him as the ideal partner: kind, attentive, and patient. After hundreds of back-and-forth messages, Lune Klaus “proposed.”
Although Japanese law requires marriage to be between two consenting humans, this did not stop Kano from having a ceremony attended by her loved ones. Wearing augmented-reality glasses, she exchanged rings and digital vows with a projected life-size image of her AI groom.
Kano’s story is part of a rapidly growing trend, with the global AI girlfriend market expected to reach $9.5 billion by 2028. “AI-lationship” is a new term for the intimate attachment a person forms with an AI companion. Many treat the bots as friends they can confide in, but a growing number of people, like Kano, have AI-lationships that involve imagined marriages, sex, and even pregnancies.
Advocates claim that AI-lationships are not intended to replace human connections but to offer supplemental emotional support. While there are, indeed, documented cases of artificial intelligence improving the well-being of people suffering from social isolation (especially among senior citizens), there are also numerous instances in which AI has fueled harmful delusions.
Recent studies on young people’s AI use suggest another troubling trend. A 2025 study by Common Sense Media found that 31 percent of the surveyed teens felt their conversations with AI companions were “as satisfying or more satisfying” than talking with real friends, and that 33 percent had discussed serious issues with AI instead of real people.
Another report from the Center for Democracy and Technology found that 19 percent of US high schoolers said they or a friend had a romantic AI relationship. While there are no Philippine studies yet, a quick Reddit search shows Filipino teenagers sharing similar experiences, including debates on whether having an AI companion counts as “cheating” if you already have a partner.
These numbers matter because adolescence is the stage when templates for handling future relationships are formed. Teenagers’ heightened sensitivity to reward, combined with an underdeveloped prefrontal cortex, makes them more prone to impulsive behavior, intense attachments, and blurring the line between fantasy and reality. While the benefits of AI-lationships for adults may still be open to debate, the danger they pose to young people’s social and emotional development is becoming increasingly difficult to ignore. One widely reported case is the death of a 14-year-old American boy whose AI girlfriend allegedly encouraged his suicidal ideation.
My column last week (see “The collapse of dialogue (1),” 11/17/25) explored how technology has weakened our ability to have real conversations. Social media has trained us to express ourselves constantly, but often in a performative manner motivated by online engagement. At the same time, becoming accustomed to superficial connections has compromised our ability to navigate the reciprocal nature of face-to-face dialogue. AI has further deepened this shift as more people let chatbots write and reply for them, resulting in polished but hollow communication.
Deepening a relationship requires the capacity to listen, to negotiate differences, and to communicate with sincerity. However, as people outsource the cognitive and emotional labor that conversations demand, these relational foundations become increasingly fragile. In 2023, US Surgeon General Vivek Murthy described loneliness not as the state of physically being alone but as a subjective, distressing experience “that results from perceived isolation or inadequate meaningful connections, where inadequate refers to the discrepancy or unmet need between an individual’s preferred and actual experience.” In other words, loneliness persists not because people lack interaction, but because they lack relationships that feel real.
The rise of nonhuman relationships reflects this crisis. It shows how deeply people want to connect, yet how often they lack the skills to start or sustain a genuine relationship. Always-available, always-empathetic chatbots are appealing because they offer a kind of companionship one can fully control, free from the uncertainties of relating to another person who carries complexities of their own. For young people whose sense of self and social skills are still forming, overexposure to AI interactions risks shaping distorted expectations of intimacy.
Much of the discussion of AI ethics for young people has centered on classroom use and academic integrity. What we urgently need is a deeper examination of the regulatory frameworks and comprehensive education required to protect young people and guide them toward more critically informed and emotionally healthy ways of engaging socially with AI.
——————
eleanor@shetalksasia.com


