TechCheck

TechCheck Takes: How DeepSeek supercharged AI’s distillation problem 2/25/25

CNBC

Disruptors, Investing, FAANG, Technology, Business, Management, CNBC, Tech

4.856 Ratings

🗓️ 25 February 2025

⏱️ 34 minutes

Summary

Silicon Valley is reckoning with an AI development technique that could upend the leaderboard. Distillation is the idea that a small team can make an advanced AI model by extracting knowledge from a larger one. DeepSeek didn’t invent the method, but its use roiled the markets and woke the AI world up to its potential. It’s now enabling startups to compete at the cutting edge, and is deadly for the biggest AI players’ competitive edges. This video includes an interview with Glean CEO Arvind Jain.

Transcript

0:00.0

A trillion dollar sell-off.

0:01.6

Front and center this hour, the deep sell-off.

0:03.8

A major tech sell-off today.

0:06.0

DeepSeek sell-off is still really pressuring tech.

0:09.1

Triggered on its face by one Chinese startup.

0:12.5

Let's talk about DeepSeek because it is mind-blowing and it is shaking this entire industry to its core.

0:19.3

But underneath, stemming from growing fears around a technique that could upend the AI leaderboard.

0:25.6

A technique called distillation.

0:27.6

Distillation.

0:28.6

Their work is being distilled.

0:30.6

Distillation is the idea that a small team with virtually no resources can make an advanced model

0:36.6

by essentially extracting knowledge from a larger one.

0:40.4

DeepSeek didn't invent distillation, but it woke up the AI world to its disruptive potential.

0:46.3

I'm Deirdre Bosa with the TechCheck Take: AI's distillation problem. AI models are more accessible than ever.

1:06.0

That is the upshot of an AI development technique that has broken into the mainstream: distillation.

1:12.4

Geoffrey Hinton, dubbed the godfather of AI, coined the term in a 2015 paper

1:17.7

while working as an engineer at Google, writing that distillation was a way to transfer the knowledge

1:23.3

from a large, cumbersome model to a small model that's more suitable for deployment.
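The idea Hinton described — softening a large model's outputs with a temperature and training a small model to match those "soft targets" — can be sketched in a few lines of plain Python. This is an illustrative sketch of the general technique, not DeepSeek's or Google's actual training code, and the logit values below are invented for the example.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with a temperature: higher T yields softer probabilities,
    exposing more of the teacher's 'dark knowledge' about wrong answers."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened distribution (the soft
    targets) and the student's softened distribution; minimizing this pulls
    the student toward the teacher's behavior."""
    p = softmax(teacher_logits, temperature)   # teacher soft targets
    q = softmax(student_logits, temperature)   # student predictions
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

teacher = [4.0, 1.0, 0.2]   # hypothetical logits from a large model
student = [3.0, 1.5, 0.5]   # hypothetical logits from a small model
loss = distillation_loss(teacher, student)
```

In practice this loss is computed over a large batch of the teacher's outputs and the student's weights are updated by gradient descent; the loss reaches its minimum when the student reproduces the teacher's distribution exactly.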

1:29.1

Fast forward to today, and upstarts are using this method to challenge industry giants

1:33.8

with years of experience and billions in funding.

1:37.0

Put simply, here's how it works.

1:38.9

A leading tech company invests years and millions of dollars developing a top-tier AI model

...

Disclaimer: The podcast and artwork embedded on this page are from CNBC, and are the property of its owner and not affiliated with or endorsed by Tapesearch.

Generated transcripts are the property of CNBC and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.

Copyright © Tapesearch 2025.