Lex Fridman Podcast

#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization


Lex Fridman

Philosophy, Society & Culture, Science, Technology

4.7 • 13K Ratings

πŸ—“οΈ 30 March 2023

⏱️ 203 minutes


Summary

Eliezer Yudkowsky is a researcher, writer, and philosopher on the topic of superintelligent AI. Please support this podcast by checking out our sponsors:
- Linode: https://linode.com/lex to get $100 free credit
- House of Macadamias: https://houseofmacadamias.com/lex and use code LEX to get 20% off your first order
- InsideTracker: https://insidetracker.com/lex to get 20% off

EPISODE LINKS:
Eliezer's Twitter: https://twitter.com/ESYudkowsky
LessWrong Blog: https://lesswrong.com
Eliezer's Blog page: https://www.lesswrong.com/users/eliezer_yudkowsky
Books and resources mentioned:
1. AGI Ruin (blog post): https://lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities
2. Adaptation and Natural Selection: https://amzn.to/40F5gfa

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
YouTube Full Episodes: https://youtube.com/lexfridman
YouTube Clips: https://youtube.com/lexclips

SUPPORT & CONNECT:
- Check out the sponsors above; it's the best way to support this podcast
- Support on Patreon: https://www.patreon.com/lexfridman
- Twitter: https://twitter.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Medium: https://medium.com/@lexfridman

OUTLINE:
Here are the timestamps for the episode. On some podcast players you should be able to click a timestamp to jump to that time.
(00:00) - Introduction
(05:19) - GPT-4
(28:00) - Open sourcing GPT-4
(44:18) - Defining AGI
(52:14) - AGI alignment
(1:35:06) - How AGI may kill us
(2:27:27) - Superintelligence
(2:34:39) - Evolution
(2:41:09) - Consciousness
(2:51:41) - Aliens
(2:57:12) - AGI Timeline
(3:05:11) - Ego
(3:11:03) - Advice for young people
(3:16:21) - Mortality
(3:18:02) - Love

Transcript

Click on a timestamp to play from that location

0:00.0

The following is a conversation with Eliezer Yudkowsky, a legendary researcher, writer

0:05.0

and philosopher on the topic of artificial intelligence, especially superintelligent

0:10.4

AGI and its threat to human civilization.

0:15.5

And now a quick few-second mention of each sponsor. Check them out in the description

0:19.8

as the best way to support this podcast.

0:22.4

We got Linode for Linux systems, House of Macadamias for healthy midday snacks, and Inside

0:28.4

Tracker for biological monitoring.

0:31.0

Choose wisely my friends.

0:32.6

Also, if you want to work with our team, we're always hiring, go to lexfridman.com slash

0:37.9

hiring.

0:39.2

And now onto the full ad reads, as always, no ads in the middle, I try to make these

0:42.9

interesting, but if you must skip them, please still check out the sponsors.

0:47.4

I enjoy their stuff, maybe you will too.

0:51.0

This episode is sponsored by Linode, now called Akamai, and their incredible Linux virtual

0:57.9

machines.

0:58.9

It's awesome computer infrastructure that lets you develop, deploy, and scale whatever

1:03.8

applications you build faster and easier.

1:05.8

I love using them.

1:07.0

They create this incredible platform like AWS, but better in every way I know, including

1:13.3

lower cost.

1:14.3

It's incredible, human-based, in this age of AI, it's human-based customer service 24

1:21.2

7, 365.

...


Disclaimer: The podcast and artwork embedded on this page are from Lex Fridman, and are the property of its owner and not affiliated with or endorsed by Tapesearch.

Generated transcripts are the property of Lex Fridman and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.

Copyright Β© Tapesearch 2025.