Mixture of Experts
Episode 40: DeepSeek facts vs hype, model distillation, and open source competition

Updated: 2025-01-31

Description

Let’s bust some early myths about DeepSeek. In episode 40 of Mixture of Experts, host Tim Hwang is joined by experts Aaron Baughman, Chris Hay and Kate Soule. Last week, we covered the release of DeepSeek-R1; now that the entire world is up to speed, let’s separate the facts from the hype. Next, what is model distillation and why does it matter for competition in AI? Finally, Sam Altman, among other tech CEOs, shared his response to DeepSeek. Will R1 radically change the open-source strategy of other tech giants? Find out all this and more on Mixture of Experts.


00:01 – Intro

00:41 – DeepSeek facts vs hype

21:00 – Model distillation

31:21 – Open source and OpenAI


The opinions expressed in this podcast are solely those of the participants and do not necessarily reflect the views of IBM or any other organization or entity.




IBM