# Mistral releases Small 4, a 119B-parameter open-weight model

_Monday, March 16, 2026 at 9:27 PM EDT · AI · Latest · Tier 2 — Notable_

Mistral released Small 4, a 119-billion-parameter open-weight model that uses a mixture-of-experts architecture with approximately 6 billion active parameters per inference pass. The model is available for download and commercial use.

Small 4 is designed to run efficiently on consumer hardware while maintaining competitive benchmark performance. The mixture-of-experts approach routes each token through only a small subset of the model's experts, so inference cost scales with the roughly 6 billion active parameters rather than the full 119 billion.
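The routing idea can be sketched in a few lines. This is a generic top-k mixture-of-experts toy, not Mistral's actual implementation; the expert count, top-k value, and layer sizes below are hypothetical placeholders chosen for illustration.

```python
# Toy top-k mixture-of-experts routing. All sizes are hypothetical;
# this is NOT Mistral Small 4's real configuration or code.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # total experts in the layer (toy value)
TOP_K = 2         # experts activated per token (toy value)
D_MODEL = 16      # hidden size (toy value)

# Each expert is a small feed-forward weight matrix.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]
# The router assigns every expert a score for a given token.
router = rng.standard_normal((D_MODEL, NUM_EXPERTS))

def moe_forward(x):
    """Route token x through only TOP_K of NUM_EXPERTS experts."""
    scores = x @ router                      # one logit per expert
    top = np.argsort(scores)[-TOP_K:]        # indices of the k highest-scoring experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                 # softmax over the selected experts only
    # Weighted sum of the chosen experts' outputs; all other experts are skipped,
    # which is why only a fraction of the parameters do work per token.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top)), top

x = rng.standard_normal(D_MODEL)
out, used = moe_forward(x)
print(f"experts used: {sorted(used.tolist())} of {NUM_EXPERTS}")
```

With these toy numbers, each token touches 2 of 8 expert matrices; the same principle lets a large total parameter count coexist with a much smaller per-token compute budget.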

The release continues Mistral's strategy of publishing competitive open-weight models that challenge closed-source offerings from OpenAI and Anthropic.

## Sources

- [Simon Willison](https://simonwillison.net/2026/Mar/16/mistral-small-4/)

---
Canonical: https://techandbusiness.org/newswire/Xx2NJhLQJqQVPtk0mDT77G
Retrieved: 2026-05-09T22:21:21.877Z
Publisher: Tech & Business (techandbusiness.org)
