Tiny Aya

3B-parameter open model supporting 70+ languages — runs offline on a phone

Tiny Aya is a family of open-weight small language models from Cohere Labs, designed to bring multilingual AI to devices that can't reach cloud inference. The 3.35B-parameter models cover 70+ languages, including many lower-resourced ones: African, South Asian, and Asia-Pacific languages that larger multilingual models either skip or handle poorly. The family includes five variants: a base pretrained model, a globally balanced instruction-tuned version (Global), and three region-specific models, Earth (Africa/West Asia), Fire (South Asia), and Water (Asia-Pacific/Europe). Each region-specific model is tuned on a data distribution that reflects the linguistic needs of its geography, rather than averaging across all languages and underserving everyone. Tiny Aya landed in the top three on Product Hunt's April 5 leaderboard despite being a research release rather than a commercial product. The models run on Ollama, are available on Hugging Face and Kaggle, and were trained on 64 H100 GPUs, a comparatively modest run for this level of multilingual coverage.

Panel Reviews

The Builder

Developer Perspective

Ship

Ollama support means this is running locally in ten minutes. The region-specific variants are a smart design choice — a model tuned for South Asian languages will outperform a globally averaged model on those languages even at smaller parameter counts. This is the right architecture for the problem.
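If the models are published to the Ollama registry, getting one running locally really is a couple of commands. A minimal sketch follows; the tag `tiny-aya` is illustrative, not a confirmed registry name, so check the registry for the actual model identifier before running it.

```shell
# Pull the model weights from the Ollama registry
# (tag is an assumption -- substitute the real registry name)
ollama pull tiny-aya

# One-shot prompt from the command line
ollama run tiny-aya "Translate 'good morning' into Swahili."

# Ollama also exposes a local HTTP API on port 11434,
# useful for wiring the model into scripts or apps
curl http://localhost:11434/api/generate -d '{
  "model": "tiny-aya",
  "prompt": "Translate good morning into Swahili.",
  "stream": false
}'
```

Because inference happens entirely on the local machine, the same commands work with no network connection once the weights are pulled.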

The Skeptic

Reality Check

Skip

3B parameters across 70+ languages means the average per-language capacity is thin. For high-resource languages like English, Spanish, or Mandarin, you're getting a model that's clearly behind purpose-built alternatives. The compelling use case is low-resource languages — but that's a narrow market compared to the general-purpose SLM space.

The Futurist

Big Picture

Ship

The 5 billion people who don't speak English as a first language are the next wave of AI users — and they'll largely be on mobile, offline-capable devices. Tiny Aya is building the infrastructure for that wave. The region-specific model design suggests Cohere Labs is thinking seriously about this rather than treating multilingual support as a checkbox.

The Creator

Content & Design

Ship

For content creators working in non-English markets, an offline model that actually handles your language well is transformational. Offline translation and transcription with no API costs or data privacy concerns is a real workflow unlock — especially for creators in regions with unreliable connectivity.

Community Sentiment

Overall: 600 mentions
74% positive, 18% neutral, 8% negative
Hacker News: 120 mentions

Low-resource language coverage and on-device viability

Reddit: 200 mentions

Region-specific model variants and Ollama support

Twitter/X: 280 mentions

70-language offline capability on phone hardware