Language model

Wu Dao Aquila-33B

Beijing Academy of Artificial Intelligence / BAAI
Chatbot · Code generation · Text generation · Question answering · Text summarization

The flagship model in the Aquila line, combining deep contextual understanding with fast inference. It excels at text summarization and complex code generation, striking a balance between performance and parameter count.

Who said all large language models (LLMs) necessarily need to be large? In China, LLMs are currently shrinking in parameter count. According to sources, this is because the country is now focused on enabling Chinese startups and smaller entities to build their own generative AI applications. As part of this downscaling trend, in June the Beijing Academy of Artificial Intelligence (BAAI) introduced Wu Dao 3.0, a series of open-source LLMs. Based on interviews with high-ranking, anonymous sources involved in the project, IEEE Spectrum can report that Wu Dao 3.0 builds on the academy's work with Wu Dao 2.0, a sparse, multimodal generative AI model with, as has been widely reported, 1.75 trillion parameters. Although there is no single parameter count for Wu Dao 3.0 (it is a range of models of various sizes), all are well below the 1.75 trillion high-water mark that version 2.0 set.

What is Wu Dao Aquila-33B?
Who developed Wu Dao Aquila-33B?
What tasks does Wu Dao Aquila-33B solve?