🗓️ 25 July 2023
⏱️ 66 minutes
(Overtime segment available to paid subscribers below the paywall.)
2:21 Where AI fits in Connor's tech threat taxonomy
15:10 What does the "general" in artificial general intelligence actually mean?
22:26 What should worry us about AI right now?
30:01 Connor: Don't put your trust in AI companies
39:28 The promise and perils of open-sourcing AI
49:52 Why "interpretability" matters
56:31 What would an aligned AI actually look like?
1:01:00 Bridging the technology wisdom gap
Robert Wright (Bloggingheads.tv, The Evolution of God, Nonzero, Why Buddhism Is True) and Connor Leahy (Conjecture). Recorded July 11, 2023.
Comments on BhTV: http://bloggingheads.tv/videos/66476 Twitter: https://twitter.com/NonzeroPods
0:00.0 | You're listening to Robert Wright's Non-Zero Podcast. |
0:33.7 | Hi, Connor. |
0:35.2 | Hey, Bob. |
0:36.3 | How you doing? |
0:56.4 | Pretty good. Except the heat's killing me here in London. Yeah, I hear heat is a growing problem on this planet. I've caught wind of that trend. Or so I've heard. Let me introduce this. I'm Robert Wright, publisher of the Non-Zero Newsletter. This is the Non-Zero podcast. You are Connor Leahy. You pronounce it Leahy, right? |
1:07.5 | Yeah, that's right. And you are CEO of Conjecture, an AI company, whose motto is, we make AGI safe, |
1:15.2 | where AGI stands for artificial general intelligence, something that's not with us yet, but many people expect to be with us in a few years. |
1:27.5 | So you're somebody who's very concerned about risks posed by AI. And I gather many of the other employees at Conjecture are too. |
1:29.5 | I noticed that there was an internal survey done. |
1:30.5 | It's on your website. |
1:39.8 | And most of the people surveyed there think the chances that AI will lead to extinction, either via the AI itself or via bad actors using it, |
1:43.9 | the extinction of the human species, |
1:47.4 | most people at your company think the chances of that are greater than 50%, |
1:50.9 | higher than the AI community average. |
1:54.0 | There, I think, about half of the people think it's greater than 10%, |
1:56.7 | which is not nothing when you're talking about |
2:01.0 | total and utter annihilation. |
2:04.5 | And you're trying to do something about this, |
2:07.5 | trying to do something about the so-called alignment problem, |
2:10.6 | making sure that AI is aligned with human values and interests. |
2:14.9 | Before we get into what you're doing, |
2:20.4 | I wanted to kind of flesh out the nature of your concerns a little more. So I can see concerns breaking down into maybe three basic categories. |
... |