
#431 – Roman Yampolskiy: Dangers of Superintelligent AI

Lex Fridman Podcast

Lex Fridman

Philosophy, Society & Culture, Science, Technology

4.7 • 13K Ratings

πŸ—“οΈ 2 June 2024

⏱️ 143 minutes


Summary

Roman Yampolskiy is an AI safety researcher and author of a new book titled AI: Unexplainable, Unpredictable, Uncontrollable. Please support this podcast by checking out our sponsors:
- Yahoo Finance: https://yahoofinance.com
- MasterClass: https://masterclass.com/lexpod to get 15% off
- NetSuite: http://netsuite.com/lex to get free product tour
- LMNT: https://drinkLMNT.com/lex to get free sample pack
- Eight Sleep: https://eightsleep.com/lex to get $350 off

EPISODE LINKS:
Roman's X: https://twitter.com/romanyam
Roman's Website: http://cecs.louisville.edu/ry
Roman's AI book: https://amzn.to/4aFZuPb

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
YouTube Full Episodes: https://youtube.com/lexfridman
YouTube Clips: https://youtube.com/lexclips

SUPPORT & CONNECT:
- Check out the sponsors above, it's the best way to support this podcast
- Support on Patreon: https://www.patreon.com/lexfridman
- Twitter: https://twitter.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Medium: https://medium.com/@lexfridman

OUTLINE:
Here are the timestamps for the episode. On some podcast players you should be able to click a timestamp to jump to that time.
(00:00) - Introduction
(09:12) - Existential risk of AGI
(15:25) - Ikigai risk
(23:37) - Suffering risk
(27:12) - Timeline to AGI
(31:44) - AGI Turing test
(37:06) - Yann LeCun and open source AI
(49:58) - AI control
(52:26) - Social engineering
(54:59) - Fearmongering
(1:04:49) - AI deception
(1:11:23) - Verification
(1:18:22) - Self-improving AI
(1:30:34) - Pausing AI development
(1:36:51) - AI Safety
(1:46:35) - Current AI
(1:51:58) - Simulation
(1:59:16) - Aliens
(2:00:50) - Human mind
(2:07:10) - Neuralink
(2:16:15) - Hope for the future
(2:20:11) - Meaning of life

Transcript

Click on a timestamp to play from that location

0:00.0

The following is a conversation with Roman Yampolskiy, an AI safety and security researcher

0:05.9

and author of a new book titled AI: Unexplainable, Unpredictable, Uncontrollable. He argues that there's an almost 100% chance that AGI will

0:16.8

eventually destroy human civilization. As an aside, let me say that I will have many, often technical, conversations on the topic of

0:25.6

AI, often with engineers building the state-of-the-art AI systems. I would say those folks put the infamous p(doom), or the probability of AGI killing all humans,

0:36.8

at around 1 to 20 percent. But it's also important to talk to folks who put that value at 70, 80, 90, and, in the case of Roman, at 99.99 and many more

0:49.6

9s percent.

0:51.9

I'm personally excited for the future and believe it will be a good one in part

0:56.7

because of the amazing technological innovation we humans create but we must

1:02.3

absolutely not do so with blinders on, ignoring the possible risks, including

1:08.6

existential risks of those technologies. That's what this conversation is about.

1:15.0

And now, a quick few-second mention of each sponsor. Check them out in the

1:20.4

description. It's the best way to support this podcast. We got Yahoo

1:24.4

Finance for investors, MasterClass for learning, NetSuite for business,

1:29.4

LMNT for hydration, and Eight Sleep for sweet, sweet naps.

1:34.0

Choose wisely, my friends.

1:36.0

Also, if you want to get in touch with me or, for whatever reason,

1:40.0

work with our amazing team, let's say,

1:42.0

just go to lexfridman.com

1:44.0

slash contact. And now onto the full ad reads. As always, no ads in the middle. I try to

1:49.7

make these interesting, but if you must skip them, friends, please still check out our sponsors. I enjoy

1:55.2

their stuff; maybe you will too. This episode is brought to you by Yahoo Finance, a site that provides

2:02.4

financial management, reports, information, and news for investors.

...

