Your Undivided Attention

Can Myth Teach Us Anything About the Race to Build Artificial General Intelligence? With Josh Schrei

Center for Humane Technology


⭐️ 4.8 · 1.5K Ratings

🗓️ 18 January 2024

⏱️ 36 minutes

Summary

What can tales of mythology and magic teach us about the AI race? Mythologist Josh Schrei explains how looking at foundational cultural stories could guide ethical tech development.

Transcript

0:00.0

Hey everyone this is Aza and this is Tristan

0:06.8

Welcome to Your Undivided Attention. Today we're going to do a very different kind of episode.

0:14.4

You know, we're all interested in how AI and humanity can go well together.

0:18.5

And we often talk about solutions to that — technical solutions, like dealing with the chips or the training data that gets used to make AI, or policy solutions.

0:27.0

But there's this kind of deeper question, which is what is the drive to make AI in the first place?

0:32.0

And what solutions would be enough

0:34.9

when the drive behind building it

0:36.7

is almost religious or mythical in nature?

0:40.9

Now, a lot of you might think this is sounding a little bit weird for Your Undivided Attention, but a friend of CHT's kind of went behind the scenes and talked to a lot of the major players at the AI labs.

0:50.0

He wrote me an email about the conversations he's been having,

0:52.7

summarizing why everyone at the end of the day is building this.

0:56.0

Here's what he said.

0:57.2

In the end, a lot of the tech people I'm talking to,

1:00.2

when I really grill them on it, they retreat into: number one, determinism; number two, the

1:06.1

inevitable replacement of biological life with digital life; and number three, that being a good thing

1:12.0

anyways. And he goes on to say that these AI leaders have an emotional desire to meet and speak to the most intelligent entity they've ever met, and they have some ego-religious intuition that they'll somehow be a part of it.

1:25.0

It's thrilling to start an exciting fire. They feel they will die either way, so they'd prefer to light it

1:31.0

and see what happens.

1:33.2

Now, the quote I just read does not, I think, describe why everyone is pursuing AI.

1:37.5

In fact I'd say most people are not driven by this.

1:40.4

But there's a handful of people at the core of some of the frontier AGI development who I think do have this psychology.

...

Disclaimer: The podcast and artwork embedded on this page are from Center for Humane Technology, and are the property of its owner and not affiliated with or endorsed by Tapesearch.

Generated transcripts are the property of Center for Humane Technology and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.
