TED Talks Daily

AI won't plateau — if we give it time to think | Noam Brown

TED

Creativity, TED Podcast, TED Talks Daily, Business, Design, Inspiration, Society & Culture, Science, Technology, Education, Tech Demo, TED Talks, TED, Entertainment

4.1 • 11.9K Ratings

πŸ—“οΈ 1 February 2025

⏱️ 14 minutes


Summary

To get smarter, traditional AI models rely on exponential increases in the scale of data and computing power. Noam Brown, a leading research scientist at OpenAI, presents a potentially transformative shift in this paradigm. He reveals his work on OpenAI's new o1 model, which focuses on slower, more deliberate reasoning — much like how humans think — in order to solve complex problems.

Hosted on Acast. See acast.com/privacy for more information.
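OpenAI has not published how o1 reasons internally, but the general idea of spending more compute at inference time can be illustrated with a well-known public technique: self-consistency, i.e. sampling several answers and taking a majority vote. Below is a minimal Python sketch of that idea; generate_answer is a hypothetical stand-in for a real model call, not OpenAI's API, and this is not o1's actual method.

```python
import random
from collections import Counter

def generate_answer(question: str) -> str:
    """Hypothetical stand-in for one sampled model completion.

    A real system would sample a language model at temperature > 0;
    a noisy toy solver keeps this sketch self-contained.
    """
    return random.choice(["42", "42", "42", "41"])  # right 3 times out of 4

def think_longer(question: str, samples: int = 16) -> str:
    """Spend more inference-time compute: sample many answers, then majority-vote.

    More samples means more "thinking time"; accuracy tends to improve
    because independent errors rarely agree on the same wrong answer.
    """
    votes = Counter(generate_answer(question) for _ in range(samples))
    return votes.most_common(1)[0][0]

print(think_longer("What is 6 * 7?"))  # almost always prints "42"
```

The point of the sketch is only that a model given sixteen tries and a vote is far more reliable than the same model given one fast try, which is the intuition behind scaling "thinking time" rather than training compute.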

Transcript


0:00.0

Hi, I'm Adam Grant, host of the podcast Rethinking, a show where I talk to some of today's

0:04.4

greatest thinkers about the unconventional ways they see the world. On rethinking, you'll get

0:09.0

surprising insights from scientists, leaders, artists, and more. People like Reese Witherspoon,

0:14.7

Malcolm Gladwell, and Yo-Yo Ma. Hear lessons to help you find success at work, build better

0:19.8

relationships, and more.

0:21.8

Find rethinking wherever you get your podcasts.

0:33.7

You're listening to TED Talks Daily, where we bring you new ideas to spark your curiosity every day.

0:39.8

I'm your host, Elise Hu. It turns out it's not just humans who think fast and slow.

0:46.1

AI models also need time to think if we want them to perform better.

0:51.6

In his 2024 talk, OpenAI research scientist Noam Brown

0:56.8

shares new understanding about AI that can inform how to make models work better

1:02.0

and do more at scale.

1:04.8

It's coming up.

1:09.0

The incredible progress in AI over the past five years can be summarized in one word: scale.

1:15.6

Yes, there have been algorithmic advances, but the frontier models of today are still based on the same transformer architecture that was introduced in 2017.

1:32.0

And they are trained in a very similar way to the models that were trained in 2019.

1:39.0

The main difference is the scale of the data and compute that goes into these models.

1:45.0

In 2019, GPT-2 cost about $5,000 to train. Every year since then, for the past five years, the models have gotten bigger, trained for longer, on more data.

1:54.0

And every year, they've gotten better.

1:57.0

But today's frontier models can cost hundreds of millions of dollars to train.
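Taking the talk's two figures at face value, a quick back-of-the-envelope sketch (in Python, assuming "hundreds of millions" means roughly $100M to $500M, a range not stated in the talk) shows the implied year-over-year growth in training cost:

```python
# Implied annual growth in training cost, from the talk's figures:
# GPT-2 at about $5,000 in 2019, frontier models five years later.
gpt2_cost = 5_000                      # USD, 2019 (from the talk)
years = 5

for frontier_cost in (100e6, 500e6):   # assumed range for "hundreds of millions"
    growth = (frontier_cost / gpt2_cost) ** (1 / years)
    print(f"${frontier_cost / 1e6:.0f}M today implies ~{growth:.0f}x cost growth per year")
```

Sustaining a roughly 7x to 10x cost increase every single year is exactly what motivates the plateau concern Brown raises next.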

2:03.6

And there are reasonable concerns among some that AI will soon plateau or hit a wall.

2:11.6

After all, are we really going to train models that cost hundreds of billions of dollars?

...
