🗓️ 20 October 2018
⏱️ 43 minutes
0:00.0 | Welcome to the Artificial Intelligence Podcast. My name is Lex Fridman. I'm a research scientist at MIT. |
0:06.4 | If you enjoy this podcast, please rate it on iTunes or your podcast provider of choice. |
0:11.5 | Or simply connect with me on Twitter and other social networks at Lex Fridman, spelled F-R-I-D. |
0:19.1 | Today is a conversation with Yoshua Bengio. Along with Geoff Hinton and Yann LeCun, |
0:24.4 | he's considered one of the three people most responsible for the advancement of deep learning |
0:29.3 | during the 1990s and the 2000s and now. Cited 139,000 times, he has been integral to some of |
0:39.3 | the biggest breakthroughs in AI over the past three decades. |
1:00.0 | What difference between biological neural networks and artificial neural networks |
1:04.3 | is most mysterious, captivating and profound for you? First of all, there's so much we don't know |
1:12.0 | about biological neural networks and that's very mysterious and captivating because maybe it holds |
1:17.9 | the key to improving artificial neural networks. One of the things I studied recently |
1:26.8 | is something that we don't know how biological neural networks do but would be really useful |
1:33.1 | for artificial ones is the ability to do credit assignment through very long time spans. |
1:42.4 | There are things that we can, in principle, do with artificial neural nets but it's not very |
1:48.0 | convenient and it's not biologically plausible, and this mismatch, I think, this kind of mismatch, |
1:55.2 | may be an interesting thing to study, to understand better how brains might do these things, because we |
2:02.9 | don't have good corresponding theories with artificial neural nets, and maybe provide new ideas |
2:10.8 | that we could explore about things that brains do differently and that we could incorporate in |
2:19.3 | artificial neural nets. So let's break credit assignment up a little bit. So what? |
2:24.2 | It's a beautifully technical term, but it could incorporate so many things. So is it more on the |
2:31.0 | RNN memory side? Is it thinking like that, or is it something about knowledge, building up common |
2:37.4 | sense knowledge over time, or is it more in the reinforcement learning sense that you're picking |
... |
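The difficulty Bengio alludes to, credit assignment through very long time spans, is often illustrated by the vanishing-gradient problem in recurrent nets: the derivative of a late hidden state with respect to an early input shrinks (or explodes) as it is chained back through many steps. Below is a minimal NumPy sketch of this effect for a scalar tanh RNN; the function name, recurrence weight, and sequence lengths are illustrative choices, not anything from the interview.

```python
import numpy as np

def early_input_gradient(T, w=0.9, seed=0):
    """Magnitude of d h_T / d x_1 for a scalar RNN h_t = tanh(w*h_{t-1} + x_t).

    A large |d h_T / d x_1| means the net can still assign credit to the
    very first input after T steps; a tiny value means that signal is lost.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(scale=0.1, size=T)   # random input sequence
    h = 0.0
    grad = 0.0                          # running derivative of h_t w.r.t. x_1
    for t in range(T):
        pre = w * h + x[t]
        h = np.tanh(pre)
        local = 1.0 - h**2              # tanh'(pre)
        if t == 0:
            grad = local                # d h_1 / d x_1
        else:
            grad = local * w * grad     # chain rule through the recurrence
    return abs(grad)

print(early_input_gradient(T=5))       # gradient over a short span
print(early_input_gradient(T=200))     # far smaller: credit signal has vanished
```

Since `tanh'` is at most 1 and `|w| < 1`, the gradient is bounded by `w**(T-1)` and decays geometrically with the horizon, which is why plain backpropagation through time struggles with long-range credit assignment and why architectures like LSTMs were introduced.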