Marianne Brandon, Ph.D.
The Future of Intimacy


ARTIFICIAL INTELLIGENCE

When AI Companions Feel Safer Than Human Connection

Anime porn, AI relationships, and a retreat from human contact.


KEY POINTS
  • Tech and sex tech are rapidly redefining “intimacy” for all of us.
  • The reach of online bullying makes human connection riskier than ever for youth.
  • AI can offer assurances of safety, comfort, and kindness that humans sometimes fail to provide.
  • Making human intimacy more trustworthy than tech requires pleasurable social experiences early in life.

The New Adolescent Baseline: Stressed, Online, and Overwhelmed

Adolescents are struggling. Research consistently highlights their high rates of stress and anxiety. In one survey, 46 percent of young people under 35 endorsed the statement, “On most days I’m so stressed I can’t function” (APA, 2022). Yet 23 percent of younger adults say they are not comfortable talking with others about their mental health (Harris/APA, 2025).

[Image created by ChatGPT 5.1]

Their media diet doesn’t help. Short-form video on TikTok, Instagram, and YouTube is increasingly linked to poorer cognitive and mental health (Nguyen and colleagues, 2025). Young people themselves call this “brain rot,” and emerging research supports their concern (Yousef and colleagues, 2025). Adults ages 18–39 now self-report “cognitive disability” (problems with memory, concentration, decision-making, and brain fog) at twice the rate they did in 2013 (Wong and colleagues, 2025).

For some teens, even sexual media reflects a move away from real people. Many describe a preference for “anime porn” or other stylized, non-human sexual content—animated or illustrated characters with exaggerated bodies, rather than videos of actual humans. In fact, according to Pornhub, viewers 25 and younger are almost 200 percent more likely to view anime porn than porn of real humans (Pornhub, 2024). This can feel safer; no one is being hurt or judged in real time. Yet it may also nudge desire away from messy, imperfect human partners and toward fantasy worlds with lovers who never talk back.

Alongside these cognitive and emotional challenges, young people are spending less time interacting in person and increasingly using chatbots for companionate, romantic, and sexual relationships. One recent study reported that about half of a nationally representative sample of 13–17-year-olds had done so, and of these, roughly 31 percent said these relationships were “as satisfying” or “more satisfying” than human relationships (Robb and Mann, 2025). If many teens see chatbots as at least as satisfying as other people, we need to understand why.

Why AI Can Feel Safer Than People

Adolescents’ online worlds echo our own. When adults publicly shame, dogpile, and mock one another, teens receive the message that this is normal behavior. What once might have been the painful experience of being frozen out by a friend group can now become shaming by an entire school — or even strangers across the country — through a single screenshot or post. It is easy to see why many adolescents retreat into tech and away from one another.

In that context, AI companions and chatbots can feel like a refuge. Young people may turn to them for solace, sexual excitement, and relationships that feel controllable. For those who maintain strong real-world human connections, research suggests that digital intimacy can decrease loneliness (De Freitas and colleagues, 2025). But for those lacking human support, artificial connection can leave them feeling lonelier, not less so (Willoughby and colleagues, 2025). The problem is not simply the technology itself, but the absence of reliable human care away from the screen.

AI companions also offer a kind of guarantee that human beings cannot: They are programmed to be kind, nonjudgmental, consistent, and always available. They never roll their eyes, ignore a text, share a private photo, or start a rumor. Against a backdrop of potential cruelty and public exposure, the predictability of an algorithm can feel far more trustworthy than the unpredictability of peers.

Seen this way, the shift toward artificial intimacy isn’t just a fad; it may be a defensive adaptation to a relentlessly challenging social world. Even if human companionship remains the gold standard, chatbots and other sex tech will continue to be compelling to the degree that they meet the needs of young people who don’t feel safe connecting with other humans. If we are not careful, it will be AI — not humans — that shows adolescents what “being human” looks like.

What Adults Can Do

If we want young people to trust human relationships more than algorithms, we have to make human relationships feel safer and more rewarding.

On a broad cultural level, that means rewarding wisdom, not rage. The internet and social media offer us many benefits; however, these technologies also enable us to vent rage, publicly shame others, and promote unrealistic expectations about everything from physical appearance to relationships and life itself. Adults set the emotional tone online. If we were to model curiosity, nuance, and repair, we would make it easier for teens to imagine that those qualities can exist in their worlds, too.

Closer to home, we can give children and adolescents many experiences of pleasure and safety in real-world relationships. The most powerful interventions will help children build positive, trusting connections at home, at school, and in their communities. These early experiences can equip teens to interpret digital relationships — including sex tech, anime porn, and AI companions — as one aspect of a balanced emotional and sexual life rather than a replacement for human connection.

Most adults do not live in social worlds as encapsulated, surveilled, and fragile as the ones teens inhabit today, with so many watchers ready to shame or exclude them in an instant. Many of us, under similar scrutiny, might also seek comfort in tech-driven intimacy.

Increasingly, it is AI that offers adolescents steadiness, kindness, and attention. When a chatbot feels more reliable than classmates, teachers, or even parents, the question is not simply what is wrong with our kids — or with technology. The deeper question is what has gone wrong in the emotional climate we have built around them.

Engaging with artificial intimacy may, in this context, be a brave adaptation to an environment that often works against safety, kindness, and privacy. Our task is to build a culture where warm, trustworthy human connection is common enough that AI becomes an optional supplement, not the safest place a teen can turn. Right now, many adolescents are learning warmth, patience, and acceptance from algorithms instead of adults. When software models more humanity than we do, the crisis is not in our machines — it is in us.

 
References
American Psychological Association. (2022). Stress in America 2022: Concerned for the future, beset by inflation. American Psychological Association. https://www.apa.org/news/press/releases/stress/2022/concerned-future-inflation

De Freitas, J., Oğuz-Uğuralp, Z., Uğuralp, A. K., & Puntoni, S. (2025). AI companions reduce loneliness. Journal of Consumer Research. Advance online publication. https://academic.oup.com/jcr/advance-article-abstract/doi/10.1093/jcr/ucaf040/8173802

The Harris Poll, & American Psychological Association. (2025, May 7). Mental health awareness month: Key findings on U.S. attitudes and barriers to care. The Harris Poll. https://theharrispoll.com/briefs/mental-health-awareness-month-key-findings-on-u-s-attitudes-and-barriers-to-care-2/

Nguyen, L., Walters, J., Paul, S., Monreal Ijurco, S., Rainey, G. E., Parekh, N., Blair, G., & Darrah, M. (2025). Feeds, feelings, and focus: A systematic review and meta-analysis examining the cognitive and mental health correlates of short-form video use. Psychological Bulletin, 151(9), 1125–1146. https://doi.org/10.1037/bul0000498

Pornhub. (2024, December 9). 2024 year in review. https://www.pornhub.com/insights/2024-year-in-review

Robb, M. B., & Mann, S. (2025). Talk, trust, and trade-offs: How and why teens use AI companions. Common Sense Media. https://www.commonsensemedia.org/research/talk-trust-and-trade-offs-how-and-why-teens-use-ai-companions

Willoughby, B. J., Carroll, J. S., Dover, C. R., & Hakala, R. H. (2025). Counterfeit connections: The rise of romantic AI companions and AI sexualized media among the rising generation. Wheatley Institute. https://wheatley.byu.edu/Counterfeit-Connections-AI-Romantic-Companions

Wong, K.-H., Anderson, C. D., Peterson, C., Bouldin, E., Littig, L., Krothapalli, N., Francis, T., Kim, Y., Cucufate, G., Rosand, J., Sheth, K. N., & de Havenon, A. (2025). Rising cognitive disability as a public health concern among US adults: Trends from the Behavioral Risk Factor Surveillance System, 2013–2023. Neurology, 105(8), e214226. https://doi.org/10.1212/WNL.0000000000214226

Yousef, A. M. F., Alshamy, A., Tlili, A., & Metwally, A. H. S. (2025). Demystifying the new dilemma of brain rot in the digital era: A review. Brain Sciences, 15(3), 283. https://doi.org/10.3390/brainsci15030283
