Open source

Mistral Small 4

Mistral AI's unified MoE model - 119B total parameters with 6B active per token, 128 experts, a 256K context window, configurable reasoning, and an Apache 2.0 license.
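A minimal sketch of what "119B total parameters, 6B active per token" means in a Mixture-of-Experts layer: a router scores all experts per token but only the top few are run. The dimensions, `top_k` value, and routing details below are illustrative assumptions, not Mistral's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_experts = 128  # total experts per MoE layer (from the announcement)
top_k = 2        # experts activated per token (assumed for illustration)
d_model = 64     # hidden size, illustrative only

def moe_layer(x, router_w, expert_ws):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router_w                            # (tokens, n_experts)
    chosen = np.argsort(logits, axis=-1)[:, -top_k:] # indices of top-k experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        scores = logits[t, chosen[t]]
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                     # softmax over chosen experts only
        for w, e in zip(weights, chosen[t]):
            out[t] += w * (x[t] @ expert_ws[e])      # only top_k experts ever run
    return out

x = rng.standard_normal((4, d_model))
router_w = rng.standard_normal((d_model, n_experts))
expert_ws = rng.standard_normal((n_experts, d_model, d_model))
y = moe_layer(x, router_w, expert_ws)
print(y.shape)  # (4, 64)
```

Per token, only `top_k` of the 128 expert weight matrices participate in the forward pass, which is how active-parameter count stays a small fraction of the total.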

FLUX.2 [dev]

Black Forest Labs' 32B open-weight image model - billed as the most powerful open alternative for text-to-image generation, editing, and multi-reference workflows with up to 10 reference images.

FLUX.2 [klein] 4B

Black Forest Labs' fastest open-source image generation model - 4B parameters, Apache 2.0 license, sub-second generation on consumer GPUs with 13GB VRAM.

Shopify CEO Uses AI Agent to Make Liquid 53% Faster

Tobi Lütke ran Karpathy's autoresearch loop against the Liquid templating engine he created 20 years ago, producing 93 commits from 120 experiments that cut parse+render time by 53% and allocations by 61%.

Italian-Legal-BERT

Italian-Legal-BERT is a 110M-parameter domain-adapted BERT model for Italian legal NLP, trained on 3.7GB of court decisions from Italy's National Jurisprudential Archive.