#371 – Max Tegmark: The Case for Halting AI Development

Lex Fridman Podcast

Lex Fridman

Philosophy, Society & Culture, Science, Technology

4.7 • 13K Ratings

🗓️ 13 April 2023

⏱️ 174 minutes

🧾️ Download transcript

Summary

Max Tegmark is a physicist and AI researcher at MIT, co-founder of the Future of Life Institute, and author of Life 3.0: Being Human in the Age of Artificial Intelligence. Please support this podcast by checking out our sponsors:
- Notion: https://notion.com
- InsideTracker: https://insidetracker.com/lex to get 20% off
- Indeed: https://indeed.com/lex to get $75 credit

EPISODE LINKS:
Max's Twitter: https://twitter.com/tegmark
Max's Website: https://space.mit.edu/home/tegmark
Pause Giant AI Experiments (open letter): https://futureoflife.org/open-letter/pause-giant-ai-experiments
Future of Life Institute: https://futureoflife.org
Books and resources mentioned:
1. Life 3.0 (book): https://amzn.to/3UB9rXB
2. Meditations on Moloch (essay): https://slatestarcodex.com/2014/07/30/meditations-on-moloch
3. Nuclear winter paper: https://nature.com/articles/s43016-022-00573-0

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
YouTube Full Episodes: https://youtube.com/lexfridman
YouTube Clips: https://youtube.com/lexclips

SUPPORT & CONNECT:
- Check out the sponsors above; it's the best way to support this podcast
- Support on Patreon: https://www.patreon.com/lexfridman
- Twitter: https://twitter.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Medium: https://medium.com/@lexfridman

OUTLINE:
Here are the timestamps for the episode. On some podcast players, you can click a timestamp to jump to that point.
(00:00) - Introduction
(07:34) - Intelligent alien civilizations
(19:58) - Life 3.0 and superintelligent AI
(31:25) - Open letter to pause Giant AI Experiments
(56:32) - Maintaining control
(1:25:22) - Regulation
(1:36:12) - Job automation
(1:45:27) - Elon Musk
(2:07:09) - Open source
(2:13:39) - How AI may kill all humans
(2:24:10) - Consciousness
(2:33:32) - Nuclear winter
(2:44:00) - Questions for AGI

Transcript

Click on a timestamp to play from that location

0:00.0

The following is a conversation with Max Tegmark, his third time on the podcast.

0:04.5

In fact, his first appearance was episode number one of this very podcast.

0:09.6

He is a physicist and artificial intelligence researcher at MIT,

0:14.0

co-founder of the Future of Life Institute, and author of Life 3.0,

0:18.8

being human in the age of artificial intelligence.

0:22.5

Most recently, he's a key figure in spearheading the open letter calling for a six-month pause

0:28.3

on giant AI experiments like training GPT-4.

0:33.1

The letter reads,

0:34.6

we're calling for a pause on training of models larger than GPT-4 for six months.

0:41.0

This does not imply a pause or ban on all AI research and development

0:45.0

or the use of systems that have already been placed in the market.

0:48.8

Our call is specific and addresses a very small pool of actors

0:53.6

who possess this capability.

0:56.0

The letter has been signed by over 50,000 individuals,

0:58.9

including 1800 CEOs and over 1500 professors.

1:03.6

Signatories include Yoshua Bengio, Stuart Russell, Elon Musk,

1:07.9

Steve Wozniak, Yuval Noah Harari, Andrew Yang, and many others.

1:12.8

This is a defining moment in the history of human civilization,

1:16.4

where the balance of power between human and AI begins to shift.

1:21.9

And Max's mind and voice are among the most valuable and powerful in a time like this.

1:28.5

His support, his wisdom, and his friendship have been a gift I'm forever deeply grateful for.

1:36.6

And now, a quick few-second mention of each sponsor.

...

