Created using Ideogram 2.0 Turbo with the prompt, "a cinematic photo of a whale labeled 'DEEPSEEK' falling out of the sky onto a fishing boat labeled 'OpenAI'"

A Quant Firm’s Side Project Is Beating Tech Giants at AI

Chinese quant firm High-Flyer has been quietly running circles around major tech companies with their AI project, DeepSeek. While Meta and OpenAI spend billions training their models, DeepSeek trained theirs for just $5.6M, using GPUs the firm already had for its trading operations.
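
For context, that $5.6M figure is a narrow one: per DeepSeek’s V3 technical report, it covers roughly 2.79M H800 GPU-hours at an assumed rental rate of $2 per GPU-hour, and excludes prior research and ablation runs. A quick sanity check of the arithmetic:

```python
# Back-of-envelope check of the headline training-cost figure,
# using the numbers stated in the DeepSeek-V3 technical report.
gpu_hours = 2.788e6        # total H800 GPU-hours for the full training run
usd_per_gpu_hour = 2.0     # the report's assumed H800 rental price

total_cost_usd = gpu_hours * usd_per_gpu_hour
print(f"${total_cost_usd / 1e6:.2f}M")  # prints "$5.58M", i.e. the ~$5.6M headline
```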

Their latest model, DeepSeek V3, matches or beats GPT-4o and Claude 3.5 Sonnet on key benchmarks. Their R1 model performs on par with OpenAI’s o1 at a tiny fraction of the cost. Together, these results demonstrate two important points:

1. Open source AI is only 3-6 months behind closed source systems
2. Cost efficiency matters more than raw spending

DeepSeek’s success highlights why initiatives like Stargate are so important for maintaining America’s competitive edge in AI. We need more investment in data centers and energy infrastructure to support AI development at scale.

What impresses me most about DeepSeek’s approach is how they turned an existing asset – GPUs used for trading – into a competitive advantage in AI development. While other companies throw billions at the problem, DeepSeek found a way to achieve similar results through smart resource allocation.

For more on the rapidly changing dynamics between open and closed source AI models, check out my analysis of MiniMax’s recent open source release here.

The lesson? You don’t need endless capital to compete in AI – you need to be strategic about using the resources you already have.