Into the Impossible With Brian Keating

Nick Bostrom: Superintelligence (#256)

Brian Keating

Science, Physics, Natural Sciences

4.7 • 1.1K Ratings

🗓️ 7 September 2022

⏱️ 67 minutes


Summary

Nick Bostrom is a Swedish-born philosopher at the University of Oxford known for his work on existential risk, the anthropic principle, human enhancement ethics, superintelligence risks, and the reversal test. In 2011, he founded the Oxford Martin Program on the Impacts of Future Technology, and he is the founding director of the Future of Humanity Institute at Oxford University. In 2009 and 2015, he was included in Foreign Policy's Top 100 Global Thinkers list.

Bostrom is the author of over 200 publications; he has written two books and co-edited two others. The two books he has authored are Anthropic Bias: Observation Selection Effects in Science and Philosophy (2002) and Superintelligence: Paths, Dangers, Strategies (2014). Superintelligence was a New York Times bestseller, was recommended by Elon Musk and Bill Gates among others, and helped to popularize the term "superintelligence".

Bostrom believes that superintelligence, which he defines as "any intellect that greatly exceeds the cognitive performance of humans in virtually all domains of interest," is a potential outcome of advances in artificial intelligence. He views the rise of superintelligence as potentially highly dangerous to humans, but nonetheless rejects the idea that humans are powerless to stop its negative effects.

In his book Superintelligence, Professor Bostrom asks the questions: What happens when machines surpass humans in general intelligence? Will artificial agents save or destroy us? Nick Bostrom lays the foundation for understanding the future of humanity and intelligent life. The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. If machine brains surpassed human brains in general intelligence, then this new superintelligence could become extremely powerful, possibly beyond our control.
As the fate of the gorillas now depends more on humans than on the species itself, so would the fate of humankind depend on the actions of the machine superintelligence. But we have one advantage: we get to make the first move. Will it be possible to construct a seed Artificial Intelligence, to engineer initial conditions so as to make an intelligence explosion survivable? How could one achieve a controlled detonation?

https://www.fhi.ox.ac.uk/
https://nickbostrom.com/

Related Episodes:
David Chalmers elaborates on the simulation hypothesis, virtual reality, and his philosophy of consciousness: https://youtu.be/ywjbbQXAFic
Sabine Hossenfelder on Existential Physics: https://youtu.be/g00ilS6tBvs

Connect with me:
🏄‍♂️ Twitter: https://twitter.com/DrBrianKeating
📸 Instagram: https://instagram.com/DrBrianKeating
🔔 Subscribe: https://www.youtube.com/DrBrianKeating?sub_confirmation=1
📝 Join my mailing list; just click here: http://briankeating.com/list
✍️ Detailed blog posts here: https://briankeating.com/blog.php
🎙️ Listen on audio-only platforms: https://briankeating.com/podcast

Join Shortform through my link Shortform.com/impossible and you'll receive 5 days of unlimited access and an additional 20% discount on an annual subscription!

Subscribe to the Jordan Harbinger Show for amazing content from Apple's best podcast of 2018!

Can you do me a favor? Please leave a rating and review of my podcast:
🎧 On Apple devices, click here, scroll down to the ratings, and leave a 5-star rating and review of The INTO THE IMPOSSIBLE Podcast.
🎙️ On Spotify it's here
🎧 On Audible it's here
Other ways to rate here: https://briankeating.com/podcast

Support the podcast on Patreon or become a Member on YouTube.

Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript

Click on a timestamp to play from that location

0:00.0

Welcome, loyal Into the Impossible listeners, or maybe you're just simulated listeners.

0:12.8

Maybe I'm simulated.

0:14.6

Who's to know?

0:15.8

All I know is that this interview with Nick Bostrom was a bucket-list conversation, in which

0:20.8

we went deep talking about the most important topics in all of science perhaps

0:27.0

the nature of existence. Whether or not we are simulations depends on whether or not you take seriously what's called

0:35.0

Superintelligence. We talked about that from his phenomenal 2014 book of the same name.

0:40.2

We also discussed a wide variety of topics including whether or not the Earth should

0:46.9

fear depopulation more than fearing aliens or perhaps what is called ominously the Great Filter.

0:55.0

Talked about a tremendous variety of subjects ranging from the risk of super intelligent AI

1:02.0

to whether or not humanity has witnessed its first global scale

1:05.8

phenomena for the first time documented by high performance computing devices, cameras, and

1:11.0

humans, if you assume that we are human and not just simulated, if we're simulated,

1:15.8

then it really shouldn't be such a big deal to get my kids to fold their laundry.

1:20.0

If these AlphaFold computer algorithms and machines can do the same thing,

1:24.2

Why can't my kids fold their d'- All right, I'm going off on a tangent.

1:27.5

Sorry about that, but this was really a delightful discussion with Nick Bostrom, who was

1:32.0

originally born in Sweden, along with past guest Max Tegmark,

1:37.0

and future guests, the ABBA group.

1:39.0

Now hopefully, we can get on some musical entertainment

1:42.0

that's a little bit younger than

1:43.8

ABBA, but he is currently a professor at Oxford University where he has the

...


Disclaimer: The podcast and artwork embedded on this page are from Brian Keating, and are the property of its owner and not affiliated with or endorsed by Tapesearch.

Generated transcripts are the property of Brian Keating and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.

Copyright © Tapesearch 2025.