AI LATEST
Mistral releases Small 4, a 119B-parameter open-weight model
Mistral released Small 4, a 119-billion-parameter open-weight model built on a mixture-of-experts architecture that activates approximately 6 billion parameters per inference pass. The model is available for download and commercial use.
Small 4 is designed to run efficiently on consumer hardware while maintaining competitive benchmark performance. The mixture-of-experts approach routes each query through only a small subset of the model's experts, so most parameters stay idle and compute requirements drop accordingly.
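The sparse-activation idea can be illustrated with a toy sketch. This is a generic top-k mixture-of-experts router, not Mistral's actual implementation; the function names, dimensions, and `top_k=2` choice are all hypothetical.

```python
import numpy as np

def moe_forward(x, experts, gate_w, top_k=2):
    """Route one input through a sparse mixture of experts.

    Only the top_k highest-scoring experts run, so the vast
    majority of parameters stay inactive for any single input.
    (Toy illustration; not Mistral's architecture.)
    """
    logits = x @ gate_w                   # one routing score per expert
    top = np.argsort(logits)[-top_k:]     # indices of the top_k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()              # softmax over the chosen experts only
    # Weighted sum of the selected experts' outputs; unselected
    # experts contribute no compute at all.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 8, 16
x = rng.standard_normal(d)
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
gate_w = rng.standard_normal((d, n_experts))

y = moe_forward(x, experts, gate_w, top_k=2)  # only 2 of 16 experts executed
```

At production scale the same principle applies per token: a 119B-parameter model with roughly 6B active parameters does the forward-pass work of a much smaller dense model.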
The release continues Mistral's strategy of publishing competitive open-weight models that challenge closed-source offerings from OpenAI and Anthropic.