GuppyLM


A 9M-param fish LLM that teaches you how transformers actually work

GuppyLM is a deliberately tiny language model — 9 million parameters, 6 transformer layers — that roleplays as a fish and can be fully trained in under 5 minutes on a free Google Colab T4 GPU. The entire pipeline, from data generation to training loop to inference, fits in approximately 130 lines of PyTorch, making it one of the most compact end-to-end LLM tutorials around. Unlike educational projects that paper over complexity with abstraction layers, GuppyLM intentionally omits modern optimizations — no RoPE positional encoding, no grouped-query attention, no SwiGLU activations — so you can see exactly what each component buys you. It ships with a 60,000-example synthetic conversation dataset and produces coherent (if goofy) fish-themed responses after training. The project hit the top of Hacker News' Show HN with 365 points and 31 comments. Developers praised how the simplicity forces you to confront directly how training data shapes model behavior, with multiple commenters calling it the clearest path from 'I know Python' to 'I understand why LLMs work.'
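For readers who want a feel for what "vanilla" means here, the shape of such a model can be sketched in PyTorch: learned absolute position embeddings instead of RoPE, standard multi-head attention instead of grouped-query attention, and a plain GELU MLP instead of SwiGLU. This is an illustrative sketch, not GuppyLM's actual code — the class names and hyperparameters below are invented for the example and are not tuned to land on exactly 9M parameters.

```python
import torch
import torch.nn as nn

class VanillaBlock(nn.Module):
    """One pre-LayerNorm transformer block: standard multi-head
    attention plus a plain GELU MLP (no GQA, no SwiGLU)."""
    def __init__(self, d_model=192, n_heads=6):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x):
        # Causal mask: each token may only attend to earlier positions.
        T = x.size(1)
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        h = self.ln1(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=mask, need_weights=False)
        x = x + attn_out
        x = x + self.mlp(self.ln2(x))
        return x

class TinyLM(nn.Module):
    """Six vanilla blocks with *learned* absolute position embeddings —
    the pre-RoPE way of telling the model about token order."""
    def __init__(self, vocab_size=4096, d_model=192, n_layers=6, max_len=256):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        self.blocks = nn.ModuleList(VanillaBlock(d_model) for _ in range(n_layers))
        self.ln_f = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, vocab_size, bias=False)

    def forward(self, idx):
        pos = torch.arange(idx.size(1), device=idx.device)
        x = self.tok(idx) + self.pos(pos)
        for blk in self.blocks:
            x = blk(x)
        return self.head(self.ln_f(x))  # next-token logits

model = TinyLM()
logits = model(torch.randint(0, 4096, (2, 32)))
print(logits.shape)
```

Removing any one of these pieces — the causal mask, the position embeddings, the residual connections — and watching generation degrade is exactly the kind of experiment the project's stripped-down design invites.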

Panel Reviews

The Builder


Developer Perspective

Ship

130 lines from raw data to inference — I've never seen a more honest on-ramp to transformer internals. The deliberate omission of RoPE and SwiGLU forces you to understand the delta between vanilla and modern architectures. Assign this to every junior ML engineer before they touch Hugging Face.

The Skeptic


Reality Check

Skip

This is education, not tooling — calling it a 'language model' is generous for something that outputs fish puns. The synthetic training data is simplistic and the architecture is years behind real LLMs. Fine for learning, but don't confuse novelty with utility.

The Futurist


Big Picture

Ship

The best thing about GuppyLM is that it normalizes building your own models from scratch. As AI democratizes, the next generation of builders needs to understand transformers at the implementation level — not just prompt them. This is exactly the kind of artifact that spawns a thousand domain-specific tiny models.

The Creator


Content & Design

Ship

A fish that learned to talk about water from 60K synthetic conversations is unexpectedly charming. The project has a clear personality and a memorable hook — it's the kind of thing that goes viral in classrooms because students actually want to run it. Clever branding for an educational tool.

Community Sentiment

Overall: 815 mentions
77% positive · 17% neutral · 6% negative
Hacker News: 365 mentions

Most accessible transformer tutorial seen; trains in 5 minutes on free Colab

Reddit: 200 mentions

Great for teaching; surprised by how coherent the outputs are

Twitter/X: 250 mentions

Viral among ML educators sharing it as a starting point