
China Maps AI Dominance in $70B Five-Year Plan
China's National People's Congress opens this week with a 15th Five-Year Plan that puts $70 billion in semiconductor subsidies and AI-plus manufacturing at the center of its tech race with the West.

A pre-release comparison of DeepSeek V4 and Claude Opus 4.6 - pitting the open-weight challenger, which could match Opus on coding, against the incumbent at potentially 89x lower output cost.

Two Chinese open-weight trillion-parameter MoE models, each with ~32B active parameters - DeepSeek V4 bets on cost and context, while Kimi K2.5 bets on Agent Swarm and verified benchmarks.

A pre-release comparison of DeepSeek V3.2 and V4 - examining the generational leap from a 671B text-only model to a trillion-parameter, natively multimodal model with a 1M-token context window.

DeepSeek V4 is an unreleased trillion-parameter MoE model with ~32B active parameters, native multimodal capabilities, a 1M-token context window, and optimization for Huawei Ascend chips - expected in the first week of March 2026.

DeepSeek will release V4, a natively multimodal trillion-parameter model with a 1M-token context window, in the first week of March - optimized for Huawei Ascend chips rather than Nvidia hardware.