How to Use Ollama From Your Android Phone in 2026
Ollama lets you run large language models on your desktop. Models far too big for a phone - Llama 3.1 70B, Mistral Large, CodeLlama 34B - can run on your desktop and be accessed from your phone over your home network.
What you need
- Desktop or laptop running Ollama with at least one model loaded
- Android phone with Off Grid installed
- Both devices on the same WiFi network (or Ollama accessible via VPN/Tailscale)
Step 1 - Configure Ollama to accept remote connections
By default, Ollama listens only on localhost (127.0.0.1). Setting OLLAMA_HOST to 0.0.0.0 binds it to all network interfaces so your phone can reach it:
macOS / Linux:
OLLAMA_HOST=0.0.0.0 ollama serve
Or set it as a permanent environment variable:
# ~/.zshrc or ~/.bashrc
export OLLAMA_HOST=0.0.0.0
Windows: Set OLLAMA_HOST=0.0.0.0 as a system environment variable and restart Ollama.
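Before moving on, you can confirm the server actually accepts remote connections. A minimal Python sketch, assuming the default port 11434 and the fact that a running Ollama server answers GET / with the text "Ollama is running":

```python
import urllib.request

def ollama_base_url(host, port=11434):
    # 11434 is Ollama's default port
    return f"http://{host}:{port}"

def is_ollama_up(host, port=11434, timeout=3):
    # A running Ollama server responds to GET / with "Ollama is running"
    try:
        with urllib.request.urlopen(ollama_base_url(host, port), timeout=timeout) as r:
            return b"Ollama is running" in r.read()
    except OSError:
        return False
```

Run it on any machine on the same network, passing your desktop's IP. A `False` result usually means OLLAMA_HOST wasn't picked up or a firewall is blocking port 11434.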
Step 2 - Find your desktop’s local IP
macOS: System Settings → Network → your WiFi connection → IP address (e.g. 192.168.1.42)
Windows: ipconfig in terminal → IPv4 address under your WiFi adapter
Linux: ip addr show - look for your WiFi interface
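If you'd rather not dig through system settings, a small Python sketch run on the desktop can ask the OS which local address it would use for outbound traffic. Connecting a UDP socket doesn't actually send any packets; 8.8.8.8 is just an arbitrary public address to pick a route:

```python
import socket

def local_ip():
    # "Connecting" a UDP socket to a public address makes the OS pick
    # the outbound interface; getsockname() then reveals its local IP.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        # No route available - fall back to loopback
        return "127.0.0.1"
    finally:
        s.close()
```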
Step 3 - Connect from Off Grid
- Open Off Grid → Settings → Remote Servers
- Tap Add Server
- Enter http://192.168.1.42:11434 (replace with your desktop’s IP)
- Tap Test Connection - it should show green
- Tap Save
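The Test Connection step can also be reproduced by hand: Ollama exposes GET /api/tags, which lists the models installed on the server. A sketch, with the base URL as a placeholder for your desktop's address:

```python
import json
import urllib.request

def list_remote_models(base_url, timeout=5):
    # GET /api/tags returns {"models": [{"name": ...}, ...]}
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as r:
        return parse_model_names(r.read().decode())

def parse_model_names(tags_json):
    # Pull just the model names out of the /api/tags response
    return [m["name"] for m in json.loads(tags_json)["models"]]
```

If `list_remote_models("http://192.168.1.42:11434")` returns your model names, Off Grid's test should go green too.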
Step 4 - Select a model and chat
- Open the model picker
- You’ll see models loaded on your Ollama server listed under Remote
- Select one and start chatting
Your queries go from your phone → your desktop → back to your phone. Nothing touches the internet.
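That round trip is plain HTTP under the hood. As a sketch of what the app does, Ollama's /api/generate endpoint takes a JSON payload; with "stream": false the whole reply comes back as a single JSON object (the model name and URL here are examples):

```python
import json
import urllib.request

def build_payload(model, prompt):
    # stream=False asks Ollama for one complete JSON reply
    # instead of a stream of partial chunks
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(base_url, model, prompt, timeout=120):
    # POST /api/generate; the answer text is in the "response" field
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as r:
        return json.loads(r.read())["response"]
```

For example, `generate("http://192.168.1.42:11434", "llama3.1:70b", "Hello")` returns the model's reply as a string.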
Using Tailscale for access outside your home
If you want to use Ollama from your phone while away from home, Tailscale creates a private VPN between your devices. Install it on both your desktop and phone, then use the Tailscale IP of your desktop instead of the local one.
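Tailscale assigns each device an address from the CGNAT block 100.64.0.0/10, so a quick check can tell you whether the address you're about to enter is a Tailscale IP or a LAN one. A small sketch:

```python
import ipaddress

def is_tailscale_ip(ip):
    # Tailscale addresses come from the CGNAT range 100.64.0.0/10
    return ipaddress.ip_address(ip) in ipaddress.ip_network("100.64.0.0/10")
```

So `is_tailscale_ip("100.101.102.103")` is true, while a home address like `192.168.1.42` only works on your local network.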
FAQ
Can I use Ollama from my phone without internet? Yes - over local WiFi only. For remote access you need Tailscale or a similar VPN.
Which Ollama models work best from a phone?
Any model loaded on your desktop works. llama3.1:70b and mistral-large are popular choices since they’re too large to run locally on a phone.