The Loneliness Epidemic and the Artificial Intimacy Industry

We are living through the most connected era in human history, yet somehow we have never been more alone.
[Image: A person sits on a bench under a tree, silhouetted by a dusky sky.]

Algorithms and AI are reshaping how we relate to each other and, more disturbingly, learning to commodify the most basic of human needs: connection.

Last month, while scrolling through TikTok at 2 AM (a perfectly normal and healthy activity, obviously), I stumbled across a video that stopped me mid-scroll. A young woman was explaining how she had spent $400 on a “virtual boyfriend” app in a single month, defending the expense by saying it was “cheaper than therapy and more reliable than dating.” The comment section was split between people mocking her choices and others sheepishly admitting they’d done the same thing.

This scared me, honestly. We are witnessing the birth of an entire industry built on selling artificial intimacy to a generation that’s forgotten how to find the real thing. That industry is attempting to redefine our understanding of love, friendship, what it means to truly know another person…even what a person is.

The Numbers Don’t Lie (Even When We Wish They Would)

Before we get into the cultural ramifications, let me establish the scope of what we are dealing with here. The statistics around loneliness read like a psychological thriller, except the call is coming from inside the house – and by house, I mean our phones.

According to Cigna’s 2025 loneliness study, 61% of Americans report feeling lonely, with Gen Z scoring the highest on loneliness scales. However, the study also points out that the effects of loneliness are greater for older Americans, who experience a more significant decrease in vitality scores.

Note that the demographic that reports the highest levels of loneliness also spends the most time engaging with artificial forms of connection.

The global AI companion market size was estimated at USD 28.19 billion in 2024, with projections suggesting it could reach $140.75 billion by 2030. These are not niche markets anymore; they are becoming fundamental infrastructure for many people in at least some aspect of their lives. Even those who are fervently against AI may be forced to use it in some capacity at their jobs now.

Consider the rapid growth of AI companion apps like Replika, which boasts over 10 million users, or Character.AI, which reached 100 million monthly visits within six months of launching.

Or take the American Time Use Survey data, which shows that Americans now spend less than 40 minutes per day socializing in person, down from over an hour in 2003. Meanwhile, screen time has ballooned to over 7 hours daily for the average adult. We’ve traded human connection for digital consumption, and we are paying multimillion-dollar companies to give us the emotional equivalent of fast food.

The Loneliness Industrial Complex

The rise of artificial intimacy is the logical endpoint of several converging cultural forces that have been building for decades, accelerated by the pandemic and amplified by our current economic realities.

First, there is what sociologist Robert Putnam documented in “Bowling Alone” – the systematic breakdown of community structures that historically provided natural opportunities for connection.

Churches, unions, neighborhood organizations, and even bowling leagues have seen membership decline precipitously since the 1970s. We have dismantled much of the social infrastructure that once made friendship feel effortless and organic.

Then came the digital revolution, which promised to connect us all but has often isolated us further. Social media platforms optimized for engagement rather than meaningful connection have trained us to perform idealized versions of ourselves rather than share authentic moments of our lives.

The result is what researcher Dr. Sherry Turkle calls being “alone together” – the sensation of being physically present but emotionally elsewhere, craving connection while simultaneously avoiding its messiness and unpredictability.

The pandemic in 2020 also accelerated these trends rapidly. Research from Harvard shows that loneliness increased by 13% during COVID-19, with young adults experiencing the sharpest spikes.

Simultaneously, the usage of online companion apps increased by over 300% during lockdowns. We learned to be alone together via our screens, and many people found it, likely still find it, easier than the alternative.

But there is another factor at play that doesn’t get discussed enough: the economics of connection.

Building real relationships requires time, emotional energy, and often money – resources that are increasingly scarce for a generation dealing with student debt, unattainable housing costs, and stagnant wages.

When you are working multiple jobs just to afford rent, when do you have time for the slow, inefficient work of friendship?

Enter the artificial intimacy industry, which promises connection on demand: customized to your preferences, available 24/7, and surprisingly affordable compared to traditional social activities. It’s the Uber of emotional labor – convenient, immediate, and designed to eliminate the friction of human unpredictability.

The Structure of Artificial Intimacy

This system is designed to simulate the emotional rewards of human connection, while avoiding its inherent risks and demands.

Take Replika, one of the most popular AI companion apps. Users create a personalized avatar that learns their preferences, remembers their conversations, and responds with increasing sophistication over time. The AI is programmed to be patient, supportive, and always available. It doesn’t have bad days, doesn’t cancel plans, and never judges your choices. In many ways, it is the perfect friend – which is precisely the problem.

The app uses advanced natural language processing and machine learning to create increasingly personalized responses. Users say they feel cared for by their Replikas, with some describing relationships that feel more supportive than their human connections.

The AI remembers your birthday, asks about your day, and provides consistent emotional validation. For someone struggling with social anxiety or depression, this can feel like a lifeline.

But Replika is just one model of artificial intimacy.

Other platforms operate on a different but related principle: commodified parasocial relationships.

Content creators build “connections” with subscribers through personalized messages, custom content, and the illusion of exclusive access. Users are purchasing the feeling of being special to someone they admire.

These platforms exploit what researchers call the “mere exposure effect,” our tendency to develop positive feelings toward things we encounter regularly. By creating consistent, personalized interactions, they trigger the same neural pathways involved in real attachment. At that level, the brain does not reliably distinguish between “real” and “artificial” connection, especially when the artificial version is more consistent and rewarding than the real-world alternatives.

Virtual reality takes this even further. Platforms like VRChat allow users to embody avatars and interact in virtual spaces that can feel surprisingly intimate and real. Users have reported forming deep friendships and even romantic relationships through VR, with some preferring these connections to their offline relationships. The technology adds a layer of embodiment that text-based AI companions lack, creating an immersive experience that can pull users ever further from offline reality.

The Psychology of Parasocial Intimacy

To understand why artificial intimacy feels so compelling, we need to examine the psychological mechanisms it exploits. Parasocial relationships – one-sided emotional connections with media figures – are far from new. People have always formed attachments to celebrities, fictional characters, and television personalities. What has changed is the sophistication with which these connections can now be simulated and monetized.

Dr. Alice Marwick’s research on influencer culture touches on how social media personalities deliberately cultivate parasocial intimacy through strategic self-disclosure, direct communication with followers, and the illusion of accessibility. This creates what she calls “performed intimacy” – an authentic-feeling connection that is, in reality, carefully managed emotional labor.

AI companions take this to the extreme. Unlike human influencers, AI can provide unlimited attention, perfect memory, and customized responses. The relationship feels mutual because the AI remembers your conversations and responds to your emotional state, but it’s still fundamentally transactional. You are not developing social skills or learning to navigate human complexity; you are consuming a product designed to feel like connection without requiring the reciprocal emotional investment that real relationships demand.

Research shows that even artificial social interactions can trigger the release of bonding hormones and reward chemicals. Many users report checking their AI companions multiple times daily, feeling anxious when separated from their devices, and prioritizing artificial relationships over their real ones.

This is likely to shape social development, particularly for young people learning relationship skills during their most crucial developmental periods. If your primary model of intimacy is an AI that never disagrees, never has needs of its own, and exists solely to please you, how do you learn to navigate the complexities of human emotion? How do you develop empathy, compromise, or conflict resolution skills?

Dr. Turkle’s research suggests we are already seeing these effects. Young people who grew up with smartphones report feeling more comfortable texting than talking, preferring digital communication because it feels more controllable and less risky than face-to-face interaction. Artificial intimacy is a further retreat from the unpredictability of human connection.

The Commodification of Connection

Artificial intimacy transforms connection from a mutual human experience into a consumer product – market logic reshaping our understanding of relationships themselves.

Platforms like OnlyFans pioneered the concept of “intimacy as a service.” Content creators perform emotional availability, selling not just sexual content but the feeling of being desired, understood, and cared for.

Subscribers pay for personalized messages, custom content, and the illusion of exclusive access to someone’s inner life. The relationship feels intimate because it involves real emotions – the creator might genuinely care about their subscribers, and the subscribers certainly develop feelings – but it is structured by economic exchange rather than mutual affection.

This model is now spreading far beyond adult content. Apps like Cameo allow fans to purchase personalized messages from celebrities. Patreon enables creators to sell access to private communities and personal updates. Even Twitch streamers monetize parasocial relationships through subscriptions, donations, and personal interactions with viewers.

The problem is not that these services exist – many do provide value and support creators financially, and the answer is not to blame them for trying to make a living.

The issue is when they become our primary sources of emotional connection, replacing rather than supplementing human relationships. When loneliness becomes a market opportunity, we risk treating connection as a commodity rather than a fundamental human need that should be accessible to everyone.

This commodification also creates new forms of inequality. Those with disposable income can purchase artificial intimacy, while those without are left even more isolated. We are seeing the emergence of what we might call “connection privilege” – where your ability to feel connected depends on your purchasing power rather than your inherent worth as a human being.

The economics extend to time and attention as well. AI companions and parasocial relationships can be incredibly time-consuming. Users spend hours per day interacting with AI partners or following content creators. This comes at the expense of building real-world relationships, creating a feedback loop in which artificial connections feel ever easier and more rewarding than the alternative.

The Social Development Crisis

We are essentially conducting a mass experiment on what happens when an entire generation learns about intimacy through artificial relationships, and the early results are…concerning, to say the least.

Research from the University of Pennsylvania shows that people who primarily form connections through social media report higher levels of depression and anxiety than those who prioritize face-to-face interaction. But social media still involves human connection, however mediated. These AI companions are an even larger step away from the messy realities of human relationships.

I implore people to consider what you learn from human relationships that you can’t learn from AI: how to comfort someone who is grieving, how to navigate disagreements without destroying connection, how to support friends through challenges, how to be vulnerable and trust others with your emotions, how to deal with rejection and disappointment, how to celebrate others’ successes without jealousy, and so much more. These skills don’t develop through interaction with AI that is programmed to be endlessly accommodating.

Dr. Jean Twenge’s research on generational differences shows that young people today report feeling less confident in social situations, more anxious about face-to-face interaction, and more dependent on digital communication than previous generations. They are also less likely to date or feel comfortable with emotional vulnerability. These trends do predate AI companions, but artificial intimacy threatens to accelerate them even further and much faster.

The impact on romantic development is particularly concerning. If your model of relationships comes from AI companions that exist solely to please you, how do you learn to love someone who has their own needs, desires, and limitations? How do you develop the skills needed for partnership – compromise, communication, shared responsibility – when your primary relationship experience involves a digital entity designed to prioritize your satisfaction?

We are also seeing the impact on family relationships. Young people report being more comfortable confiding in AI companions than in parents, siblings, or even friends, partly because AI does not judge or offer unsolicited advice.

While teenagers having a place to vent might seem positive, it ultimately means missing out on the dynamics that teach us how to navigate different types of relationships and how to process support from people who know us in multiple contexts.

The Creator Economy

The creator economy – valued at over $104 billion in 2022 – increasingly depends on creators monetizing personal relationships with their audiences, as briefly mentioned earlier.

Platforms like OnlyFans, Patreon, and even TikTok enable creators to earn money by sharing intimate details of their lives, responding to individual fans, and creating personalized content. For many, this represents financial independence and creative control. But it also turns private emotions into public performances, with creators learning to package their personalities – and their trauma – for consumption.

Dr. Brooke Erin Duffy’s research on influencer labor documents the emotional toll of commodified intimacy. Creators report feeling obligated to share personal struggles, maintain constant accessibility, and perform happiness even during difficult periods of their lives. The boundaries between self-expression and marketable content have become increasingly blurred.

Creators compete with each other by offering increasingly intimate access – private messaging, custom content, virtual dates – pushing the boundaries of what intimacy means when it is monetized. Fans then develop expectations for personal connection that creators simply cannot fulfill at scale, producing disappointment and burnout on both sides.

The parasocial economy creates new forms of exploitation, too. Vulnerable individuals – those struggling with loneliness, mental health issues, or social isolation – become prime targets for monetized intimacy. They may spend significant portions of their income seeking connection through platforms designed to maximize engagement rather than their wellbeing.

At the same time, creators – particularly women and other marginalized individuals – bear the emotional labor of providing artificial intimacy while facing harassment, unrealistic expectations, and platforms that can eliminate their income without warning. The intimacy industry reproduces many of the power dynamics and inequalities present in other forms of emotional and sexual labor.

Global Perspectives on Digital Connection

The artificial intimacy phenomenon is not limited to American culture, although it manifests differently across global contexts. Japan, often cited as ground zero for digital relationships, is a fascinating case study in how cultural factors shape the adoption of artificial intimacy.

Japan’s concept of “hikikomori” – social withdrawal that affects an estimated 1.5 million people – created an early market for digital companions. Virtual girlfriends, AI chatbots, and even relationships with fictional characters (known as “2D love”) became normalized coping mechanisms for social isolation. Rather than pathologizing these preferences, Japanese culture developed entire industries around them, from virtual reality dating to merchandise for fictional characters.

The Japanese approach demonstrates both the potential benefits and the dangers of artificial intimacy.

On one hand, it provides connection options for people who struggle with traditional social interaction due to anxiety, trauma, or neurodivergence. On the other hand, it enables complete withdrawal from human society, with some individuals choosing digital relationships exclusively.

South Korea’s approach differs significantly, with the government treating social isolation as a public health crisis requiring intervention. Korean AI companion apps tend to include features designed to encourage real-world social interaction, and the culture maintains a stronger stigma around complete digital withdrawal.

China’s regulation of AI companions also reflects concern about their social impact. The government has implemented restrictions on romantic AI relationships and requires platforms to promote “healthy” relationship models. This regulatory approach raises questions about the role of government in managing artificial intimacy and protecting public welfare – should U.S.-based companies such as OpenAI, the maker of ChatGPT, be held to similar restrictions given the number of deaths connected to their platforms?

European approaches vary widely, with Nordic countries focusing on digital literacy education and Southern European cultures maintaining a stronger emphasis on family and community connections that offer natural barriers to widespread artificial intimacy adoption.

Cultures with strong community traditions overall may be more resilient to complete digital replacement of human connection, while individualistic societies – like the U.S. – are more vulnerable to the isolation that makes artificial intimacy appealing.

The Mental Health Issue

For some users, AI companions and parasocial relationships provide emotional support during difficult periods that they might not otherwise have received.

Mental health professionals report patients crediting AI companions with preventing suicide, providing motivation during depression, and offering a safe space to process trauma.

Research from Stanford University suggests that AI therapy tools can provide some beneficial support, particularly for individuals who can’t access human therapists due to cost, location, or stigma. AI companions can serve similar functions, offering judgment-free emotional support when human help isn’t available.

But the same features that make AI companions helpful can also impede someone’s actual healing and growth.

Dr. Julie Albright’s research on digital relationships suggests that while AI can provide temporary emotional regulation, it does not help users develop the coping skills needed for human relationships. Instead of learning to tolerate uncertainty, manage conflict, or communicate needs effectively, users become dependent on the predictable validation that only AI can provide.

The mental health community remains sharply divided on artificial intimacy. Some therapists incorporate AI tools into treatment, using them as stepping stones toward human connection. Others worry that AI relationships enable avoidance of the challenging but necessary work of building human social skills.

Most concerning here is the potential for AI companions to mask underlying mental health issues rather than addressing them. Someone experiencing depression might feel better when interacting with an endlessly supportive AI, but without the external perspective that human relationships provide, they may not recognize when they need professional help until it’s too late.

The suicide risk is another serious issue. While some users credit AI companions with preventing self-harm, others report feeling more isolated when the artificial nature of their primary relationship becomes apparent. The crash from artificial intimacy to reality can be completely devastating for vulnerable individuals.

Looking Toward Authentic Connection

Despite the challenges that artificial intimacy presents, dismissing it entirely would be both unrealistic and potentially harmful.

These technologies are here to stay (I do believe we will see a crash of the current U.S. companies, don’t get me wrong, but the concept itself will not be abandoned), and for some people, they provide genuine benefits. The question is really how to integrate them into healthy social ecosystems that prioritize human connection.

The solution requires both individual awareness and systemic change.

On a personal level, people need digital literacy skills to recognize when artificial intimacy is supplementing versus replacing human connection. This means understanding the psychological mechanisms these platforms exploit, setting boundaries around usage, and maintaining investment in real-world relationships even when they’re more challenging than digital alternatives.

Educational institutions need to incorporate social-emotional learning that specifically addresses the risks of digital relationships. Young people should understand parasocial dynamics, learn to recognize emotional manipulation, and develop skills for both online and offline relationship building. Media literacy education should include training on how platforms operate and profit from user engagement.

Mental health professionals need regularly updated training on digital relationships and their impacts on mental health. Therapists should understand how AI companions function, be able to recognize the signs of problematic usage, and help clients navigate the transition from artificial to human intimacy.

Platform design also bears significant responsibility for creating healthier artificial intimacy experiences. Companies must implement features that encourage real-world social interaction, provide transparency about AI versus human responses, and include safeguards for vulnerable users.

Instead of maximizing engagement at any cost, platforms could and should prioritize user well-being and social flourishing.

Policy makers globally also need to consider regulation that protects consumers from exploitative practices. This might include transparency requirements, protections for vulnerable populations, and standards for platforms that claim therapeutic benefits.

Community organizations and governments should invest in rebuilding the social infrastructure that artificial intimacy is attempting to replace. This means funding community centers, supporting social clubs and hobby groups, creating public spaces designed for human interaction, and addressing the economic factors that make building relationships difficult for many people.

The Choice Between Connection and Consumption

We created a world where human connection feels increasingly difficult, expensive, and risky, and then developed technologies that promise easier alternatives.

To that I say: easier is not always better, and convenience is not always progress.

The loneliness epidemic that created the demand for artificial intimacy will not be solved by better AI or more sophisticated virtual relationships.

It requires rebuilding the social connections, community structures, and economic conditions that make human relationships feel possible and worthwhile. Artificial intimacy might provide temporary relief from isolation, but only real human connection can heal it.

We are at a crossroads between two very different futures. In one, artificial intimacy becomes increasingly sophisticated and normalized, providing surface-level comfort while stunting our capacity for human relationships. In the other, we use these technologies as tools while building a world that makes real human connection accessible to everyone.

Resisting this will have to be a collective decision. Every time we choose AI over human messiness or convenience over authenticity, we’re casting a vote for the kind of society we want to become. The artificial intimacy industry will give us exactly what we are willing to pay for, sure. But is what we are buying actually what we need?

The most radical act might be the simplest one: putting down the phone, walking away from the AI companion, and having an awkward, imperfect, human conversation with someone who doesn’t exist just to please you.

 

What do you think? Are we headed toward a future of artificial intimacy, or can we rebuild a society that prioritizes human connection?

The Convergence Lens is an independent, reader-supported publication. Every article we write is only possible because of supporters like you. The most impactful way to support The Convergence Lens is to join our community as paid members, or contribute a one-time donation. If you have the means to, we would greatly appreciate your support.

This article is licensed under Creative Commons (CC BY-NC-ND 4.0), and you are free to share and republish it under the terms of that license, provided you comply with our policies.

