
DeepSeek V3.2 Review: GPT-5 Performance at a Fraction of the Cost
A thorough review of DeepSeek V3.2, the 671B parameter MoE model that delivers frontier-level performance at dramatically lower cost with an MIT license.

DeepSeek has released V3.2 under an MIT license, a 671B-parameter MoE model that matches GPT-5 at roughly one-tenth the cost and achieves gold-medal-level results on the IMO and IOI competitions.