The compact Aya Expanse 8B offers impressive multilingual capability in a lighter package. The model is optimized for 23 languages and is well suited for developers who need an efficient, fast open-weights model for translation and text generation.
Today, Cohere For AI, Cohere's research arm, is proud to announce Aya Expanse, a family of highly performant multilingual models that excels across 23 languages and outperforms other leading open-weights models. We are releasing Aya Expanse as 8-billion- and 32-billion-parameter open-weights models, available on Kaggle and Hugging Face, as part of our continued commitment to multilingual research and to accelerating the frontier of multilingual AI. The 8-billion-parameter model makes these breakthroughs more accessible to researchers worldwide, while the 32-billion-parameter model offers state-of-the-art multilingual capabilities. Aya Expanse marks an important step toward expanding high-quality language coverage in LLMs.

Since we first launched the Aya initiative two years ago, we have collaborated with over 3,000 researchers from 119 countries to expand cutting-edge multilingual research. This included releasing the Aya collection, the largest multilingual dataset collection to date with 513 million examples, along with critical evaluation sets for multilingual performance and safety. It also included the release of Aya-101, the most comprehensive multilingual model to date, covering 101 languages.