
Weaponizing Uncertainty: How Tech is Recycling Big Tobacco’s Playbook

Your Undivided Attention

Center for Humane Technology


4.8 • 1.5K Ratings

🗓️ 20 March 2025

⏱️ 51 minutes


Summary

From Big Tobacco to Big Tech, powerful industries have perfected the art of manufacturing doubt about their harms. In this episode, historian Naomi Oreskes reveals the playbook corporations have used throughout history to weaponize uncertainty, fund fake experts, and shift blame to individuals—and what it all means for AI.

Transcript


0:00.0

Hey, everyone, it's Tristan.

0:05.6

It's Daniel. Welcome to Your Undivided Attention.

0:10.4

So, Daniel, something I think about often is how, throughout history, society takes a lot of time to confront the harms caused by certain industries. I think about Upton Sinclair writing about the meatpacking industry in the early 20th century. I think about Rachel Carson's Silent Spring in the 1960s and the problems of pesticides, or tobacco in the 1990s. And with social media, we're seeing it happen again. The can just keeps getting kicked down the road. And with AI moving so fast, it feels like the normal time it takes us to react isn't compatible with doing something soon enough. You know, we can become aware of serious problems, but if it takes too long to respond, meaningful action won't follow.

0:41.9

Totally. And I think this has to do with the way that we manage uncertainty in our society. You know, with any new thing, with any industry, it's important that we sit with the uncertainty as we discover what's happening. But also, uncertainty is scary. And it's really easy for us to react to that fear we experience sitting with uncertainty by avoiding thinking or speaking about topics when we feel uncertain. And then, you know, as a society, I often think about how, when we're uncertain about what's true or who to trust, we struggle to make collective, informed decisions.

1:16.6

And when we watch experts battling it out in public, when we hear conflicting narratives and strong emotions, it's easy to start to doubt what we think we know. And it's important to recognize that that's not by accident. You know, it's because companies and individuals with a lot of money and a lot of power want to hide growing evidence of harm, and they do so with sophisticated and well-funded campaigns that are specifically designed to create doubt and uncertainty. So how do we sit with this? Our guest today, historian Naomi Oreskes

...


Disclaimer: The podcast and artwork embedded on this page are from Center for Humane Technology, and are the property of its owner and not affiliated with or endorsed by Tapesearch.

Generated transcripts are the property of Center for Humane Technology and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.

Copyright © Tapesearch 2025.