Mistral Large 3 is a flagship AI model with a Mixture-of-Experts (MoE) architecture, offering exceptional capability among open models. With 675 billion total parameters, it sets a new quality bar for text generation and reasoning, released under the Apache 2.0 license.
Today, we announce Mistral 3, the next generation of Mistral models. Mistral 3 includes three state-of-the-art small, dense models (14B, 8B, and 3B) and Mistral Large 3 – our most capable model to date – a sparse mixture-of-experts trained with 41B active and 675B total parameters. All models are released under the Apache 2.0 license. Open-sourcing our models in a variety of compressed formats empowers the developer community and puts AI in people’s hands through distributed intelligence.
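The "41B active, 675B total" distinction comes from sparse routing: each token only runs through a few of the model's experts. The toy sketch below (NumPy, made-up sizes and a simple top-k softmax router; not Mistral's actual implementation) shows why the per-token active parameter count is a small fraction of the total:

```python
import numpy as np

# Toy sparse mixture-of-experts layer. Sizes are illustrative only;
# real models use full FFN experts and learned, load-balanced routers.
rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

# Each "expert" here is a single linear map of shape (d_model, d_model).
experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_forward(x):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router                              # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]    # chosen expert indices
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, top[t]]
        weights = np.exp(sel - sel.max())
        weights /= weights.sum()                     # softmax over selected experts
        for w, e in zip(weights, top[t]):
            out[t] += w * (x[t] @ experts[e])        # only top_k experts ever run
    return out

tokens = rng.standard_normal((4, d_model))
y = moe_forward(tokens)

# Only top_k of n_experts execute per token, so the active parameter
# count is top_k / n_experts of the expert total (here 2/8 = 25%).
total_params = n_experts * d_model * d_model
active_params = top_k * d_model * d_model
print(y.shape, active_params / total_params)
```

The same ratio logic is why Mistral Large 3 can hold 675B total parameters while spending only 41B parameters of compute per token.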