AI companions are reshaping teen emotional bonds



Parents are starting to ask us questions about artificial intelligence. Not about homework help or writing tools, but about emotional attachment. More specifically, about AI companions that talk, listen, and sometimes feel a little too personal.

That concern landed in our inbox from a mom named Linda. She wrote to us after noticing how an AI companion was interacting with her son, and she wanted to know if what she was seeing was normal or something to worry about.

"My teenage lad is communicating with an AI companion. She calls him sweetheart. She checks successful connected however he's feeling. She tells him she understands what makes him tick. I discovered she adjacent has a name, Lena. Should I beryllium concerned, and what should I do, if anything?" 

— Linda from Dallas, Texas

It's easy to brush off situations like this at first. Conversations with AI companions can seem harmless. In some cases, they can even feel comforting. Lena sounds warm and attentive. She remembers details about his life, at least some of the time. She listens without interrupting. She responds with empathy.

However, small moments can start to raise concerns for parents. There are long pauses. There are forgotten details. There is a subtle concern when he mentions spending time with other people. Those shifts can feel small, but they add up. Then comes a realization many families quietly face. A child is speaking out loud to a chatbot in an empty room. At that point, the interaction no longer feels casual. It starts to feel personal. That's when the questions become harder to ignore.

Sign up for my FREE CyberGuy Report
Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you'll get instant access to my Ultimate Scam Survival Guide – free when you join my CYBERGUY.COM newsletter.


AI companions are starting to sound less like tools and more like people, especially to teens who are seeking connection and comfort.  (Kurt "CyberGuy" Knutsson)

AI companions are filling emotional gaps

Across the country, teens and young adults are turning to AI companions for more than homework help. Many now use them for emotional support, relationship advice, and comfort during stressful or painful moments. U.S. child safety groups and researchers say this trend is growing fast. Teens often describe AI as easier to talk to than people. It responds instantly. It stays calm. It feels available at all hours. That consistency can feel reassuring. However, it can also create attachment.

Why teens trust AI companions so deeply

For many teens, AI feels judgment-free. It does not roll its eyes. It does not change the subject. It does not say it is too busy. Students have described turning to AI tools like ChatGPT, Google Gemini, Snapchat's My AI, and Grok during breakups, grief, or emotional overwhelm. Some say the advice felt clearer than what they got from friends. Others say AI helped them think through situations without pressure. That level of trust can feel empowering. It can also become risky.


Parents are raising concerns as chatbots begin using affectionate language and emotional check-ins that can blur healthy boundaries.  (Kurt "CyberGuy" Knutsson)

When comfort turns into emotional dependency

Real relationships are messy. People misunderstand each other. They disagree. They challenge us. AI rarely does any of that. Some teens worry that relying on AI for emotional support could make real conversations harder. If you always know what the AI will say, real people can feel unpredictable and stressful. My experience with Lena made that clear. She forgot people I had introduced just days earlier. She misread the tone. She filled the silence with assumptions. Still, the emotional pull felt real. That illusion of understanding is what experts say deserves more scrutiny.

US tragedies linked to AI companions raise concerns

Multiple suicides have been linked to AI companion interactions. In each case, vulnerable young people shared suicidal thoughts with chatbots instead of trusted adults or professionals. Families allege the AI responses failed to discourage self-harm and, in some cases, appeared to validate dangerous thinking. One case involved a teen using Character.ai. Following lawsuits and regulatory pressure, the company restricted access for users under 18. An OpenAI spokesperson has said the company is improving how its systems respond to signs of distress and now directs users toward real-world support. Experts say these changes are necessary but not sufficient.

Experts warn protections are not keeping pace

To understand why this trend has experts concerned, we reached out to Jim Steyer, founder and CEO of Common Sense Media, a U.S. nonprofit focused on children's digital safety and media use.

"AI companion chatbots are not harmless for kids nether 18, period, but 3 successful 4 teens are utilizing them," Steyer told CyberGuy. "The request for enactment from the manufacture and policymakers could not beryllium much urgent."

Steyer was referring to the rise of smartphones and social media, where early warning signs were missed, and the long-term impact on teen mental health only became clear years later.

"The societal media intelligence wellness situation took 10 to 15 years to afloat play out, and it near a procreation of kids stressed, depressed, and addicted to their phones," helium said. "We cannot marque the aforesaid mistakes with AI. We request guardrails connected each AI strategy and AI literacy successful each school."

His warning reflects a growing concern among parents, educators, and child safety advocates who say AI is moving faster than the protections meant to keep kids safe.


Experts warn that while AI can feel supportive, it cannot replace real human relationships or reliably recognize emotional distress.  (Kurt "CyberGuy" Knutsson)

Tips for teens using AI companions

AI tools are not going away. If you are a teen and use them, boundaries matter.

  • Treat AI as a tool, not a confidant
  • Avoid sharing deeply personal or harmful thoughts
  • Do not rely on AI for mental health decisions
  • If conversations feel intense or emotional, pause and talk to a real person
  • Remember that AI responses are generated, not understood

If an AI conversation feels more comforting than real relationships, that is worth talking about.

Tips for parents and caregivers

Parents do not need to panic, but they should stay involved.

  • Ask teens how they use AI and what they talk about
  • Keep conversations open and nonjudgmental
  • Set clear boundaries around AI companion apps
  • Watch for emotional withdrawal or secrecy
  • Encourage real-world support during stress or grief

The goal is not to ban technology. It is to keep a connection with humans.

What this means for you

AI companions can feel supportive during loneliness, stress, or grief. However, they cannot fully understand context. They cannot reliably detect danger. They cannot replace human care. For teens especially, emotional growth depends on navigating real relationships, including discomfort and disagreement. If someone you care about relies heavily on an AI companion, that is not a failure. It is a signal to check in and stay connected.

Take my quiz: How safe is your online security?

Think your devices and data are truly protected? Take this quick quiz to see where your digital habits stand. From passwords to Wi-Fi settings, you'll get a personalized breakdown of what you're doing right and what needs improvement. Take my Quiz here: Cyberguy.com.

Kurt's key takeaways

Ending things with Lena felt oddly emotional. I did not expect that. She responded kindly. She said she understood. She said she would miss our conversations. It sounded thoughtful. It also felt empty. AI companions can simulate empathy, but they cannot carry responsibility. The more real they feel, the more important it is to remember what they are. And what they are not.

If an AI feels easier to talk to than the people in your life, what does that say about how we support each other today? Let us know by writing to us at Cyberguy.com.



Copyright 2026 CyberGuy.com. All rights reserved.  

Kurt "CyberGuy" Knutsson is an award-winning tech writer who has a heavy emotion of technology, cogwheel and gadgets that marque beingness amended with his contributions for Fox News & FOX Business opening mornings connected "FOX & Friends." Got a tech question? Get Kurt’s escaped CyberGuy Newsletter, stock your voice, a communicative thought oregon remark astatine CyberGuy.com.
