DeepSeek chose Huawei over Nvidia to develop its next AI model
DeepSeek prefers Chinese accelerators for final AI model optimization
*According to Reuters*, large language models go through a final stage of development in which, shortly before release, they are fine‑tuned on specialized compute accelerators. For its latest model (V4), DeepSeek chose Chinese suppliers for this work – Huawei and other domestic manufacturers.
Why it matters
According to Reuters' analysis, this final optimization stage has traditionally been the preserve of American giants, chiefly Nvidia. Until recently, DeepSeek followed the same pattern: its models were optimized on Nvidia accelerators. The new V4, however, was tuned for Chinese platforms. A few weeks before release, the company granted access to the model only to domestic accelerator suppliers, so as to fully exploit local hardware infrastructure and improve AI performance.
Optimization speed
Independent experts note that modern tooling allows such tuning to be completed in a matter of weeks, rather than the months it once required. This suggests Chinese technologies are approaching readiness for mass deployment.
Political context
Analysts believe this preferential approach aligns with Chinese state policy, under which locally produced hardware and software are favored over foreign equivalents. However, reports that DeepSeek trained V4 in undisclosed data centers running Nvidia Blackwell accelerators would contradict the company's claimed switch to Huawei. U.S. officials worry that the firm may be concealing its use of Blackwell while publicly announcing work with Chinese accelerators.
Thus, DeepSeek demonstrates a clear shift toward local technologies while facing challenges of transparency and international competition.