
AI Chip Startup Cerebras Systems Looks To Challenge Nvidia's Dominance

Forbes Talks

Forbes Media LLC

Business News, Forbes, Business, News, Economics, Entrepreneurship, Politics, Policy, Breaking News


🗓️ 19 February 2025

⏱️ 22 minutes


Summary

Forbes Assistant Managing Editor Katharine Schwab talks with Cerebras Systems' CEO and cofounder Andrew Feldman about his startup's AI chip, the impact of China's DeepSeek and its implications for the global AI landscape.

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Transcript


0:00.0

Hi everyone. My name is Katharine Schwab. I'm an assistant managing editor here at Forbes

0:08.0

covering technology. And I'm so excited to be here today with Andrew Feldman, the co-founder and

0:13.7

CEO of Cerebras, which is an AI chip company. It's about eight years old and is planning an IPO for later this year,

0:22.3

which is exciting. So Andrew, thanks for joining us. Thank you for having me. Let's talk about

0:28.8

chips and AI, which is really hot right now. Normally the first company that comes to

0:34.8

mind when you talk about chips and AI is Nvidia, obviously.

0:39.3

Can you talk about how Cerebras' chips and kind of approach to chips is different from

0:47.3

Nvidia?

0:48.3

Sure.

0:49.3

Nvidia has its heritage in graphics.

0:53.3

They pioneered a part called the GPU, which stands for

0:56.7

graphics processing unit. AI work is slightly different, and we began with a clean sheet

1:03.1

of paper and designed a chip and a system and software just for the purpose of AI.

1:12.4

And as a result, we're orders of magnitude faster and we use a tiny fraction of the power

1:18.3

per unit compute.

1:19.9

Okay, so in terms of practical speed, what does that mean if I'm, well, I know you're not powering ChatGPT per se

1:31.3

because that's OpenAI. But how much faster are we talking, in kind of layman's terms?

1:38.1

Well, if you ask, say, DeepSeek, which we are serving right now, if you ask DeepSeek a question and you

1:47.5

ask the same question to OpenAI's o1-mini, one of their top models, we will be

1:54.3

about 57 times faster to get you an answer. So yeah, it's a lot. And if you compare head to head with one of, say,

2:05.6

Meta's models, Llama 70 billion parameter, which is one of the most popular models,

2:11.4

we'll be 70 times faster. So it's head and shoulders faster. The difference for the user is between a second and 24 seconds to get an answer back.

...


Disclaimer: The podcast and artwork embedded on this page are from Forbes Media LLC, and are the property of its owner and not affiliated with or endorsed by Tapesearch.

Generated transcripts are the property of Forbes Media LLC and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.

Copyright © Tapesearch 2025.