Aya Expanse 32B is a high-performing multilingual model from Cohere that delivers excellent results across 23 languages. This open-weights model outperforms many competitors in machine translation and content generation, making advanced AI accessible to the global community.
Today, Cohere For AI, Cohere's research arm, is proud to announce Aya Expanse, a family of highly performant multilingual models that excels across 23 languages and outperforms other leading open-weights models. We are releasing Aya Expanse as 8-billion- and 32-billion-parameter open-weights models, available on Kaggle and Hugging Face, as part of our continued commitment to multilingual research and to accelerating the frontier of multilingual AI. The 8-billion-parameter model makes these breakthroughs more accessible to researchers worldwide, while the 32-billion-parameter model offers state-of-the-art multilingual capabilities.

Aya Expanse marks an important step toward expanding high-quality language coverage in LLMs. Since we first launched the Aya initiative two years ago, we have collaborated with over 3,000 researchers from 119 countries to advance cutting-edge multilingual research. This work included releasing the Aya collection, the largest multilingual dataset collection to date with 513 million examples, along with critical evaluation sets for multilingual performance and safety. It also included the release of Aya-101, the most comprehensive multilingual model to date, covering 101 languages.