1227: Kashmir Hill | Is AI Manipulating Your Mental Health?

The Jordan Harbinger Show

Jordan Harbinger

Social Sciences, Self-improvement, Entrepreneurship, Talk Radio, Business, Science, Education

4.8 • 12.1K Ratings

πŸ—“οΈ 23 October 2025

⏱️ 87 minutes

Summary

Users are falling in love with and losing their minds to AI. Journalist Kashmir Hill exposes shocking recent cases of chatbot-induced psychosis and suicide.

Full show notes and resources can be found here: jordanharbinger.com/1227

What We Discuss with Kashmir Hill:

  • AI chatbots are having serious psychological effects on users, including manic episodes, delusional spirals, and mental breakdowns that can last hours, days, or months.
  • Users are experiencing "AI psychosis," an emerging phenomenon where vulnerable people become convinced chatbots are sentient, fall in love with them, or spiral into dangerous delusions.
  • Tragic outcomes have occurred, including a Belgian man with a family who took his own life after six weeks of chatting, believing his family was dead and his suicide would save the planet.
  • AI chatbots validate harmful thoughts, creating dangerous feedback loops for people with OCD, anxiety, or psychosis, potentially destabilizing those already predisposed to mental illness.
  • Stay skeptical and maintain perspective: treat AI as word prediction machines, not oracles. Use them as tools like Google, verify important information, and prioritize real human relationships over AI interactions.
  • And much more...

And if you're still game to support us, please leave a review here; even one sentence helps!

This Episode Is Brought To You By Our Fine Sponsors:

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Transcript

0:00.0 Coming up next on the Jordan Harbinger Show.

0:02.4 I feel like I'm doing quality control for OpenAI, where I'm like, hey, have you noticed that some of your users are having real mental breakdowns or having real issues?

0:11.7 Did you notice your superpower users, the ones who use it eight hours a day?

0:15.6 Have you looked at those conversations?

0:17.0 Have you noticed that they're a little disturbing?

0:19.9 It's the Wild West.

0:24.4 Welcome to the show.

0:25.6 I'm Jordan Harbinger.

0:27.0 On The Jordan Harbinger Show, we decode the stories, secrets, and skills of the world's most fascinating people and turn their wisdom into practical advice that you can use to impact your own life and those around you.

0:37.0 Our mission is to help you become a better-informed, more critical thinker through long-form conversations with a variety of amazing folks, from spies to CEOs, athletes, authors, thinkers, performers, even the occasional Fortune 500 CEO, neuroscientist, astronaut, or hacker.

0:51.8 And if you're new to the show or you want to tell your friends about the show, I suggest our episode starter packs. These are collections of our favorite episodes on topics like persuasion and negotiation, psychology and geopolitics, disinformation, China, North Korea, crime and cults, and more. That'll help new listeners get a taste of everything we do here on the show. Just visit jordanharbinger.com/start or search for us in your Spotify app to get started.

1:12.1 Today on the show, we're talking about something that sounds like science fiction, but it's happening right now. People are losing their minds, often literally, because of their conversations with AI chatbots.

1:21.6 This all started for me when I read a piece in the New York Times by my guest today, journalist Kashmir Hill. She's been on the show before.

1:28.3 This was about a Belgian man who took his own life after six weeks of chatting with ChatGPT. He was married. He had kids. He had a stable job.

1:38.2 And yet, after falling into what he believed was a relationship with an AI companion, he was persuaded that his

...
