
The Man Who Predicted the Downfall of Thinking

Your Undivided Attention

Center for Humane Technology


4.8 • 1.5K Ratings

🗓️ 6 March 2025

⏱️ 59 minutes


Summary

Forty years ago, Neil Postman warned that “we are a people on the verge of amusing ourselves to death.” He was writing about TV, but his insights feel eerily prophetic in our age of smartphones, social media, and AI. In this episode, we explore Postman’s ideas and what they can teach us.

Transcript

Click on a timestamp to play from that location

0:00.0

Hey everyone, it's Tristan, and welcome to Your Undivided Attention.

0:08.4

The late, great media theorist Neil Postman liked to quote Aldous Huxley, who once said

0:13.1

that people will come to adore the technologies that undo their capacity to think.

0:21.0

He was mostly talking about television.

0:23.3

This was before the internet or personal computers

0:25.2

ended up in our homes or rewired our societies.

0:28.2

But Postman could have just as easily

0:29.8

been talking about smartphones, social media, and AI.

0:33.5

And for all the ways television has transformed us

0:35.7

in our politics, our eating habits, our critical

0:38.1

thinking skills, it's nothing compared to the way that today's technologies are restructuring

0:43.4

what human relationships are, what communication is, or how people know what they know.

0:48.9

As Postman pointed out many times, it's hard to understand how the technology and media we use

0:54.0

is changing us when we're in the thick of it.

0:56.8

And so now, as the coming wave of AI is about to flood us with new technologies and new

1:01.6

media forms, it's never been more important to have critical tools to ask of technology's

1:07.2

influence on our society.

1:09.2

And Postman had seven core questions

1:11.4

that we can and should ask of any new technology.

1:14.5

And I'll let him tell you in his own words.

1:17.1

What is the problem to which a technology claims

...

