
# Off Grid
The Swiss Army Knife of On-Device AI.
Chat. Generate images. Use tools. See. Listen. All on your phone. All offline. Zero data leaves your device.
## What Off Grid does
| Capability | Details |
|---|---|
| Text generation | Llama, Qwen 3, Gemma 3, Phi-4, Mistral, and any other GGUF model - 15–30 tok/s on flagship devices |
| Image generation | On-device Stable Diffusion - 5–10s on NPU (Snapdragon), Core ML on iOS. 20+ models |
| Vision AI | Point your camera at anything and ask questions. SmolVLM, Qwen3-VL, Gemma 3n |
| Voice input | On-device Whisper speech-to-text. Hold to record, auto-transcribe. No audio leaves your phone |
| Tool calling | Web search, calculator, date/time, device info. Automatic tool loop |
| Document analysis | Attach PDFs, CSVs, code files. Native PDF text extraction on both platforms |
| Remote servers | Connect to Ollama, LM Studio, LocalAI on your home network |
| Works offline | Airplane mode, restricted networks, anywhere |
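The "automatic tool loop" in the table refers to a common agent pattern: the model either answers directly or requests a tool, the app runs the tool and feeds the result back, and this repeats until a final answer emerges. Below is a minimal illustrative sketch of that pattern, not Off Grid's actual implementation; the `model_step` callback stands in for local LLM inference, and the tool names are examples.

```python
from datetime import datetime

# Example tool registry (names and behavior are illustrative only).
TOOLS = {
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),  # toy, unsafe for real input
    "datetime": lambda _: datetime.now().isoformat(),
}

def run_tool_loop(model_step, prompt, max_turns=5):
    """Feed tool results back to the model until it returns a final answer.

    `model_step` stands in for on-device inference: given the transcript, it
    returns either ("tool", name, args) or ("final", text).
    """
    transcript = [("user", prompt)]
    for _ in range(max_turns):
        kind, *rest = model_step(transcript)
        if kind == "final":
            return rest[0]
        name, args = rest
        result = TOOLS[name](args)            # run the requested tool
        transcript.append(("tool", f"{name} -> {result}"))  # feed result back
    return "tool limit reached"

# Scripted stand-in for the model, to show the loop's shape:
def scripted(transcript):
    if len(transcript) == 1:
        return ("tool", "calculator", "2+3")
    return ("final", "The answer is " + transcript[-1][1].split("-> ")[1])

print(run_tool_loop(scripted, "what is 2+3?"))  # prints: The answer is 5
```

A real loop would parse tool-call markup out of the model's text output and enforce argument validation; the fixed-turn cap above is the usual guard against a model that keeps requesting tools.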
## Why local AI matters
When you run a query on a cloud AI service - ChatGPT, Gemini, Claude - it's logged on a server: your prompt, the response, the timestamp, your account. Often stored indefinitely. Potentially used to train future models. Subject to law enforcement requests. Readable by employees.
With Off Grid, none of that applies. The model runs in your phone’s memory. Inference happens on your CPU and GPU. Nothing is sent anywhere. Ever.
## Get started
## Guides
### LLMs
- How to Run LLMs Locally on Your Android Phone in 2026
- How to Run LLMs Locally on Your iPhone in 2026
### Image Generation
### Vision, Voice and Documents
- Vision AI - Analyse Images and Documents On-Device
- Voice Input - On-Device Speech-to-Text with Whisper
- Document Analysis and Attachments
- Knowledge Base and RAG
### Tools and Intelligence
### Remote Servers
- Remote Servers - Connect Ollama, LM Studio, and LocalAI
- How to Use Ollama From Your Android Phone in 2026
- How to Use LM Studio From Your Android Phone in 2026
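Connecting to a remote server generally means talking to its HTTP API over your home network. As one concrete example, here is a minimal sketch against Ollama's documented `/api/generate` endpoint (default port 11434); the LAN address and model name are placeholders you would replace with your own.

```python
import json
import urllib.request

# Placeholder: your Ollama server's LAN address (11434 is Ollama's default port).
OLLAMA_URL = "http://192.168.1.50:11434"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL + "/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(model: str, prompt: str) -> str:
    """Send the prompt and return the model's full response text."""
    with urllib.request.urlopen(build_chat_request(model, prompt), timeout=60) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a reachable Ollama server with the model pulled):
# print(ask("llama3.2", "Why is the sky blue?"))
```

With `"stream": False`, Ollama returns one JSON object whose `response` field holds the full completion; omit it to receive newline-delimited JSON chunks instead. LM Studio and LocalAI expose OpenAI-compatible endpoints, so the same idea applies with a different path and payload shape.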
## Community
Questions, feedback, and feature requests - join the Slack community.
Source code is open - star the repo on GitHub.