Altman questions DeepSeek's training cost: how many zeros are missing?

In an interview, OpenAI CEO Sam Altman cast doubt on the claim that DeepSeek trained its AI models at a cost of only $6 million, saying, "I am extremely skeptical about DeepSeek's cost figures. It's like there are a few zeros missing."

During his trip to India, Altman was interviewed by Indian media on Wednesday to discuss a series of core issues in artificial intelligence, including the rise of competitor DeepSeek.


The interviewer asked: "When the news of the Chinese model DeepSeek came out, what was your first reaction? At least the headline was that they managed to train their model at a lower cost, although it was later found that this was not the case."


Altman replied mockingly: "I am extremely skeptical about the cost figures. It seems like there are a few zeros missing. But yes, it's a good model, and we need to make a better one, and we will."


DeepSeek claims it trained its advanced artificial intelligence model R1 with only $6 million and 2,048 GPUs, making it far more cost-effective than OpenAI's o1 and other models, a claim that shocked Silicon Valley.


However, a report from SemiAnalysis disputes this claim, estimating that DeepSeek actually spent as much as $1.6 billion on hardware and owns 50,000 Nvidia Hopper GPUs.

DeepSeek itself acknowledges that its reported figure covers only the final training run, not prior research or experiments, and the new findings suggest that its AI development costs are far higher than initially reported.


While DeepSeek's cost-effectiveness remains in dispute, its rapid progress in artificial intelligence has made it a formidable competitor in the industry.


Source: Free Finance