Cohere Labs has released Tiny Aya, a family of small multilingual language models with 3.35 billion parameters. The goal is to bring AI to languages often underserved by existing models.
Key Features
- Multilingualism: Supports over 70 languages, with a focus on lower-resource languages.
- Small Size: With 3.35 billion parameters, Tiny Aya is designed for efficient deployment even with limited compute resources.
- License: Released under the CC-BY-NC license, which requires adherence to Cohere Labs' Acceptable Use Policy.
- Context Length: 8K tokens of input.
Intended Use
Tiny Aya is intended for applications such as multilingual text generation, conversational AI, summarization, translation, and cross-lingual tasks. It is also suitable for research in the field of multilingual natural language processing and modeling of low-resource languages.
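The intended uses above could be exercised through the Hugging Face transformers API. The sketch below shows a minimal generation loop; note that the repository id `CohereLabs/tiny-aya` is a placeholder assumption (the article does not name one), and the chat-template call assumes the model ships with a chat template, as most instruction-tuned releases do.

```python
# Minimal sketch of prompting a small multilingual model via transformers.
# MODEL_ID is a hypothetical repository id, not confirmed by the article.
MODEL_ID = "CohereLabs/tiny-aya"


def build_chat(prompt: str) -> list[dict]:
    """Wrap a user prompt in the messages format expected by apply_chat_template."""
    return [{"role": "user", "content": prompt}]


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate a reply to a single user prompt."""
    # Imported lazily so the helpers above can be used without the library installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer.apply_chat_template(
        build_chat(prompt), add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Translate to Swahili: Good morning, friends."))
```

The same pattern covers the translation, summarization, and conversational use cases listed above; only the prompt changes.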
Strengths
The model demonstrates good text-generation quality across all supported languages, with particularly strong performance in lower-resource languages. It also performs well on translation, summarization, and cross-lingual tasks, thanks to shared training across language families and scripts.
Limitations
The model performs best on text generation and conversational tasks; more complex reasoning tasks, such as multilingual mathematics, remain harder for it. As with any language model, outputs may contain incorrect or outdated information, especially in languages with fewer resources and less training data, and its handling of cultural nuance, sarcasm, and figurative language may be less reliable in those languages.