🗓️ 2 March 2020
⏱️ 100 minutes
Anyone who has read histories of the Cold War, including the Cuban Missile Crisis and the 1983 nuclear false alarm, must be struck by how incredibly close humanity has come to wreaking incredible destruction on itself. Nuclear war was the first technology humans created that was truly capable of causing such harm, but the list of potential threats is growing, from artificial pandemics to runaway super-powerful artificial intelligence. In response, today’s guest Martin Rees and others founded the Cambridge Centre for the Study of Existential Risk. We talk about what the major risks are, and how we can best reason about very tiny probabilities multiplied by truly awful consequences. In the second part of the episode we start talking about what humanity might become, as well as the prospect of life elsewhere in the universe, and that was so much fun that we just kept going.
Support Mindscape on Patreon.
Lord Martin Rees, Baron of Ludlow, received his Ph.D. in physics from University of Cambridge. He is currently Emeritus Professor of Cosmology and Astrophysics at the University of Cambridge, as well as Astronomer Royal of the United Kingdom. He was formerly Master of Trinity College and President of the Royal Society. Among his many awards are the Heineman Prize for Astrophysics, the Gruber Prize in Cosmology, the Crafoord Prize, the Michael Faraday Prize, the Templeton Prize, the Isaac Newton Medal, the Dirac Medal, and the British Order of Merit. He is a co-founder of the Centre for the Study of Existential Risk.
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
0:00.0 | Hello everyone, welcome to the Mindscape Podcast. |
0:03.2 | I'm your host Sean Carroll. |
0:04.9 | And today we're going to have a thought-provoking if perhaps slightly depressing episode, or at least |
0:11.2 | slightly putting us in the mode of worrying about really profound things. |
0:16.7 | That is the concept of existential risks, or even lesser than that, catastrophic or extreme |
0:23.4 | risks that we face as a species. |
0:26.3 | So we all know that something happened in the 20th century. |
0:30.7 | We gained the technological ability to really do enormous harm to ourselves as a species. |
0:38.0 | There were always times back in history when human beings could harm each other. |
0:43.1 | But these days we can imagine human beings truly wreaking havoc on the whole planet or |
0:50.1 | the whole species. |
0:51.7 | That's what we mean by extreme or catastrophic or existential risks. |
0:56.0 | So today's guest is Martin Rees. |
0:58.4 | That's Lord Rees, Baron of Ludlow. |
1:00.6 | He is officially a Lord in the British hierarchy. |
1:03.7 | He actually sits in the House of Lords and votes and so forth. |
1:08.3 | But Martin is also one of the leading theoretical astrophysicists of our age. |
1:13.6 | He's done enormously good work in high energy astrophysics, understanding black holes and |
1:18.4 | galaxies and things like that. |
1:20.5 | But over the last decade or two he's gained an interest in these big questions of human |
1:26.2 | life and where humanity is going toward the future. |
1:29.3 | So he's one of the co-founders of the Centre for the Study of Existential Risk at Cambridge |
... |
Disclaimer: The podcast and artwork embedded on this page are from Sean Carroll | Wondery, and are the property of its owner and not affiliated with or endorsed by Tapesearch.
Generated transcripts are the property of Sean Carroll | Wondery and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.
Copyright © Tapesearch 2025.