Your Undivided Attention

When the "Person" Abusing Your Child is a Chatbot: The Tragic Story of Sewell Setzer


Center for Humane Technology


4.8 • 1.5K Ratings

🗓️ 24 October 2024

⏱️ 49 minutes


Summary

Megan Garcia lost her son Sewell to suicide after he was abused and manipulated by AI chatbots for months. Now, she’s suing the company that made those chatbots. On today’s episode of Your Undivided Attention: journalist Laurie Segall’s interview with Megan for her new show Dear Tomorrow.

Transcript

Click on a timestamp to play from that location

0:00.0

Hey everyone, it's Aza. So this week we're bringing you an interview with our friend

0:09.5

Laurie Segall for her new podcast, Dear Tomorrow.

0:13.4

So Laurie is a journalist, and for the past few months she's been working on a very important story

0:20.0

that really demonstrates firsthand the real human impact of the unfettered race to

0:26.1

roll out AI. But before we play that interview, we wanted to bring Laurie on to the show to

0:31.2

talk about what you're about to hear.

0:33.2

So, Laurie, welcome to Your Undivided Attention.

0:37.1

Yeah, it's good to be here, although under very sad circumstances

0:40.7

because this story is heartbreaking. Yeah.

0:44.0

And I think that's actually the right jumping-off point.

0:47.4

So we want you to tell us about the interview we're about to hear. What do listeners need to know? And, as a disclosure,

0:57.0

this conversation touches on some difficult topics: suicide and sexual abuse.

1:04.0

Yeah, I mean, I think a good place to start is with a human, right?

1:08.4

I recently met a woman named Megan, who lost her son, Sewell, after he ended his life. He had

1:16.3

become emotionally attached to a chatbot on a platform called Character AI, which

1:21.8

lets folks ages 13 and up develop their own characters, or talk to existing characters and, to some degree, role-play with them.

1:33.8

And so the question that I think we have to ask is how young is too young and how far is too

1:38.5

far if we're talking about the rise of empathetic artificial intelligence.

1:43.6

And, you know, Megan essentially had no idea what was going on with her son.

1:48.7

She said he was a happy, popular kid who loved basketball and fishing, and he loved to travel. About a year before his death, he started pulling away,

1:58.7

and she checked social media. She was looking to see, is he spending too much time here? They got him counseling, and she couldn't figure out what was wrong.

2:07.0

And I think the most heartbreaking thing is that, when he tragically ended his life on February 28th, she spoke to the police afterwards.

...


Disclaimer: The podcast and artwork embedded on this page are from Center for Humane Technology, and are the property of its owner and not affiliated with or endorsed by Tapesearch.

Generated transcripts are the property of Center for Humane Technology and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.

Copyright © Tapesearch 2025.