Apfel
Free CLI for Apple's on-device LLM — no API key, no downloads, runs on macOS
Apfel is an open-source command-line tool that unlocks Apple's built-in Foundation Model (shipped with macOS Tahoe) via a clean CLI, an OpenAI-compatible local server on port 11434, and an interactive chat mode. No model download, no API key, no configuration — if you're on Apple Silicon running macOS Tahoe, the model is already there.

The OpenAI-compatible server mode is the clever move: any tool built on the OpenAI SDK can point at localhost:11434 and use Apple's on-device ~3B model for free, with complete privacy. The MCP support adds external tool-calling, making it genuinely useful for shell automation, text transformation, and local agent workflows.

The honest constraints: a 4,096-token context (~3,000 words) and mixed 2-bit/4-bit quantization mean this isn't a replacement for cloud models on hard tasks. But for scripting, classification, summarization, and quick transformations — all offline, all private, all free — Apfel makes the underutilized Neural Engine in every Apple Silicon Mac actually accessible.
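Because the server speaks the standard OpenAI wire format, any chat-completions client can talk to it. A minimal stdlib-only sketch, assuming the Apfel server is running on localhost:11434 and exposes the usual `/v1/chat/completions` route; the `model` name here is a placeholder, not confirmed from Apfel's docs:

```python
import json
import urllib.request

BASE_URL = "http://localhost:11434/v1"  # Apfel's OpenAI-compatible endpoint

def build_chat_request(prompt: str, model: str = "apple-on-device") -> urllib.request.Request:
    """Build a standard OpenAI-style chat-completions request.

    The model id is a placeholder; check Apfel's docs for the actual name.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        # Keep prompts short: the on-device model has a 4,096-token context.
        "max_tokens": 256,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},  # no API key needed
        method="POST",
    )

def chat(prompt: str) -> str:
    """Send the request and return the assistant's reply text."""
    with urllib.request.urlopen(build_chat_request(prompt)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Classify this sentence as positive or negative: great tool."))
```

The same applies to the official `openai` SDK: point `base_url` at `http://localhost:11434/v1` with any dummy API key and existing OpenAI-based tooling works unchanged.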
Panel Reviews
The Builder
Developer Perspective
“OpenAI-compatible server on localhost means I can prototype automations and scripts against a real LLM without paying for API calls or waiting on rate limits. The pipe-friendly CLI with proper exit codes is exactly what shell scripting needs. For Mac-native tooling, this is a genuine gap-filler.”
The Skeptic
Reality Check
“A 4,096-token context and ~3B quantized model will fail on anything non-trivial — complex coding, factual recall, multi-step reasoning. You'd still reach for Claude or GPT-4 for real work, making this a toy for most professional use cases. Also, it only runs on macOS Tahoe, which dramatically limits adoption right now.”
The Futurist
Big Picture
“Every Apple Silicon Mac now ships with a neural engine and a capable on-device LLM — Apfel is just the first tool to make that accessible via standard interfaces. This is a preview of the world where local models handle routine tasks completely off the network, with cloud models reserved for genuinely hard inference.”
The Creator
Content & Design
“Quick summaries, translation, text classification without pasting anything into a cloud service — the privacy angle alone is worth it for sensitive client work. MCP support means I can hook it into my local creative workflows. The zero-config setup removed every excuse I had not to try it.”
Community Sentiment
“Zero-config local LLM on Apple Silicon”
“OpenAI-compatible server mode”
“Privacy and offline-first AI”