How to Use LM Studio From Your Android Phone in 2026

LM Studio runs large models on your Mac or PC with a polished interface. Models too large for your phone - Llama 3.1 70B, DeepSeek, Mistral Large - can run on your desktop and stream their responses to your phone over WiFi.


What you need

  • Mac or Windows PC running LM Studio with a model loaded
  • Android phone with Off Grid installed
  • Both devices on the same WiFi network

Step 1 - Start LM Studio’s local server

  1. Open LM Studio
  2. Load a model (click the model dropdown at the top)
  3. Go to the Local Server tab (left sidebar)
  4. Click Start Server
  5. Enable “Allow connections from network” in the server settings
  6. Note the displayed port (default: 1234)
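The server you just started exposes an OpenAI-compatible HTTP API. As a sanity check before involving the phone, you can hit the model-list endpoint from the desktop itself. A minimal sketch using only the Python standard library, assuming the default port 1234 (use whatever port the Local Server tab shows):

```python
import json
import urllib.request

def models_endpoint(host: str, port: int = 1234) -> str:
    """URL of LM Studio's model-list endpoint (GET /v1/models)."""
    return f"http://{host}:{port}/v1/models"

def server_is_up(host: str, port: int = 1234, timeout: float = 2.0) -> bool:
    """Return True if an LM Studio server answers on this host and port."""
    try:
        with urllib.request.urlopen(models_endpoint(host, port), timeout=timeout) as resp:
            json.load(resp)  # a valid JSON body means the API is live
            return True
    except OSError:
        return False  # refused, unreachable, or timed out

# Example (run on the machine hosting LM Studio):
# print(server_is_up("127.0.0.1"))
```

If this returns False from another machine on the WiFi, the usual culprit is the "Allow connections from network" toggle or a firewall prompt that was dismissed.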

Step 2 - Find your computer’s local IP

macOS: System Settings → Network → Wi-Fi → Details → IP address (e.g. 192.168.1.55)

Windows: Open PowerShell → ipconfig → look for IPv4 Address under your Wi-Fi adapter
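If you'd rather script the lookup, here's a best-effort sketch using Python's standard library. The 8.8.8.8 address is only a routing hint so the OS picks the outbound interface; no packet is actually sent.

```python
import socket

def local_ip() -> str:
    """Best-effort guess at this machine's LAN IPv4 address."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # Connecting a UDP socket selects an interface without sending data;
        # the socket's own address is then the LAN IP we want.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # no route available; fall back to loopback
    finally:
        s.close()

# print(local_ip())  # e.g. 192.168.1.55
```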


Step 3 - Connect from Off Grid

  1. Open Off Grid → Settings → Remote Servers
  2. Tap Add Server
  3. Enter: http://192.168.1.55:1234 (use your computer’s actual IP)
  4. Tap Test Connection → should show green
  5. Tap Save

Off Grid automatically discovers models available on the server.
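That discovery step is a single GET to /v1/models, which returns an OpenAI-style list object. A sketch of the parsing, assuming the standard response shape (the sample payload and model ids below are illustrative, not real output):

```python
import json

SAMPLE = """{
  "object": "list",
  "data": [
    {"id": "llama-3.1-70b-instruct", "object": "model"},
    {"id": "mistral-large", "object": "model"}
  ]
}"""  # illustrative; real ids depend on what you loaded in LM Studio

def model_ids(models_json: str) -> list[str]:
    """Pull the model identifiers out of a /v1/models response body."""
    return [m["id"] for m in json.loads(models_json)["data"]]

print(model_ids(SAMPLE))  # ['llama-3.1-70b-instruct', 'mistral-large']
```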


Step 4 - Select a model and chat

Open the model picker in Off Grid. Your LM Studio models appear under the server name. Tap one to make it active and start chatting.
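Under the hood, "start chatting" means POSTing to the server's /v1/chat/completions endpoint. A minimal sketch of the request body, assuming the OpenAI-compatible schema (the model id is a placeholder):

```python
import json

def chat_request_body(model: str, prompt: str, stream: bool = True) -> str:
    """JSON body for POST /v1/chat/completions."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,  # True -> the reply arrives as an SSE stream
    })

body = chat_request_body("llama-3.1-70b-instruct", "Hello from my phone!")
```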

Responses stream in real time via SSE (server-sent events) - the same mechanism LM Studio’s own interface uses.
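Each SSE event is a `data:` line carrying a JSON chunk with a small text delta, and the client concatenates the deltas as they arrive. A sketch of that loop, assuming OpenAI-style streaming chunks (the example chunks are hand-written, not captured output):

```python
import json

def assemble_stream(lines) -> str:
    """Join the content deltas from an OpenAI-style SSE chat stream."""
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines between events
        payload = line[len("data: "):].strip()
        if payload == "[DONE]":
            break  # sentinel that marks the end of the stream
        delta = json.loads(payload)["choices"][0].get("delta", {})
        parts.append(delta.get("content") or "")
    return "".join(parts)

# Chunks roughly as they would arrive over the wire:
chunks = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo!"}}]}',
    "data: [DONE]",
]
print(assemble_stream(chunks))  # Hello!
```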


Using Tailscale for access outside your home network

Install Tailscale on both your computer and phone, then use your computer’s Tailscale IP in place of the local IP. You can now reach LM Studio from anywhere with a data connection - at the office, while traveling, wherever.
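The only thing that changes in Off Grid's server entry is the host part of the URL. The 100.x address below is a placeholder (Tailscale assigns addresses from 100.64.0.0/10; `tailscale ip -4` prints yours):

```python
def server_url(host: str, port: int = 1234) -> str:
    """Server URL to enter in Off Grid for a given host."""
    return f"http://{host}:{port}"

LAN = server_url("192.168.1.55")        # reachable on home WiFi only
REMOTE = server_url("100.101.102.103")  # placeholder Tailscale IP; reachable anywhere
```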