In a recent Bloomberg interview, OpenAI CEO Sam Altman praised DeepSeek's tech team and acknowledged their talent.
This reflection underscores OpenAI’s belief in its own technological trajectory while recognizing DeepSeek’s rise as a serious rival.
🔍 Context: Why DeepSeek Sparked a Global Stir
- DeepSeek is a Chinese AI startup whose R1 and V3 models achieved reasoning benchmarks comparable to OpenAI's models, but at a fraction of the typical cost. Its R1 app overtook ChatGPT as the most downloaded on the Apple iOS App Store in early 2025.
- The firm leverages software innovations like Mixture-of-Experts (MoE), Multi-Head Latent Attention (MLA), and low-level GPU optimizations to make AI more resource-efficient and cost-effective, reportedly training models for under $6 million versus OpenAI's $100M+ investment.
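The core idea behind MoE efficiency is that each token activates only a small subset of the model's experts, so compute per token stays low even as total parameters grow. Below is a minimal NumPy sketch of top-k expert routing; it is an illustration of the general technique, not DeepSeek's actual implementation, and all names, shapes, and values are hypothetical.

```python
import numpy as np

def top_k_moe(x, expert_weights, gate_weights, k=2):
    """Route each token to its top-k experts and mix their outputs.

    x:              (tokens, d_model) input activations
    expert_weights: (n_experts, d_model, d_model) one linear layer per expert
    gate_weights:   (d_model, n_experts) router projection
    """
    logits = x @ gate_weights                      # (tokens, n_experts)
    top_idx = np.argsort(logits, axis=-1)[:, -k:]  # top-k experts per token
    # Softmax over only the selected experts' logits
    top_logits = np.take_along_axis(logits, top_idx, axis=-1)
    gates = np.exp(top_logits - top_logits.max(-1, keepdims=True))
    gates /= gates.sum(-1, keepdims=True)

    out = np.zeros_like(x)
    for t in range(x.shape[0]):        # each token touches only k experts,
        for j in range(k):             # not all n_experts
            e = top_idx[t, j]
            out[t] += gates[t, j] * (x[t] @ expert_weights[e])
    return out

rng = np.random.default_rng(0)
tokens, d_model, n_experts = 4, 8, 16
x = rng.standard_normal((tokens, d_model))
y = top_k_moe(x,
              rng.standard_normal((n_experts, d_model, d_model)) * 0.1,
              rng.standard_normal((d_model, n_experts)),
              k=2)
print(y.shape)
```

With k=2 of 16 experts, each token pays roughly 1/8 of the dense-model compute per layer, which is the lever that makes large sparse models cheap to train and serve.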
🧩 What Altman’s Comment Implies
- Paraphrase: While DeepSeek’s achievements show technical competence, OpenAI remains confident in its own efficiency and scale. Altman suggests there is no leapfrog innovation in DeepSeek’s architecture.
- Strategic posture: Acknowledging competition but reaffirming OpenAI’s leadership in creating and optimizing large-scale models.
📊 Industry Reactions: Mixed Views on DeepSeek
| Commentator | Viewpoint |
|---|---|
| Aidan Gomez, Cohere CEO | DeepSeek’s simplicity and lack of enterprise customization may limit its utility; organizations want tailored, secure models. |
| Meta’s CTO | Praised open-source progress and framed DeepSeek’s efficiency gains as beneficial, but cautioned against overhyping its impact. |
| Reddit discussions | Mixed sentiment: users praised DeepSeek’s coding and reasoning performance relative to GPT-based models, while critics pointed to downtime, hallucinations, and hype. |
🔭 Broader AI Landscape: Efficiency vs. Capability
- Market shock: tech stocks including Nvidia saw significant volatility in response, prompting talk of a “Sputnik moment” in AI (New York Magazine).
- Jevons’ paradox in play: greater efficiency may spur overall usage, leading to higher total resource consumption. By this logic, even highly efficient AI models will ultimately demand scale.
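The Jevons dynamic is easy to see with a back-of-the-envelope calculation; the numbers below are purely hypothetical, chosen only to show how efficiency gains and demand growth interact.

```python
# Hypothetical Jevons' paradox arithmetic: if per-query cost falls 10x
# but cheaper AI drives 20x more usage, total resource spend doubles.
old_cost_per_query = 1.0                        # arbitrary units
old_queries = 100
new_cost_per_query = old_cost_per_query / 10    # 10x efficiency gain
new_queries = old_queries * 20                  # usage grows even faster

old_total = old_cost_per_query * old_queries    # 100.0
new_total = new_cost_per_query * new_queries    # 200.0
print(new_total / old_total)                    # 2.0
```

Whenever demand growth outpaces the efficiency gain, aggregate consumption rises, which is why efficient models can still translate into more, not less, total compute.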
- Open-source momentum: Meta’s Yann LeCun and others highlight the trend: open collaboration is allowing smaller players to compete—and thrive—in AI innovations.
✅ Summary Points
- Sam Altman noted DeepSeek’s talent but disagreed that it introduced a superior efficiency paradigm.
- DeepSeek is reshaping perceptions of “cheap AI,” but remains controversial in its approach and claims.
- OpenAI and competitors continue to focus on full-stack AI leadership, combining scale, infrastructure, and novel architecture.


