Project: Pocket Coder (Offline Microsoft Phi on Android)
Turn your old Android phone into a private, offline coding companion.
This guide shows you how to run Microsoft Phi-1.5, a 1.3-billion-parameter model trained specifically on "textbook quality" data, directly on your Android device using Termux. Unlike chatty general-purpose models, this one is built for code and logic. No internet required. 100% Private.
Prerequisites
- Android Phone: Any reasonably recent Android phone (4 GB of RAM or more recommended).
- Termux App: A terminal emulator for Android.
- Recommended: Download from F-Droid (The Play Store version is outdated).
Step-by-Step Installation
1. Install & Update Termux
Open Termux and run the following command to ensure your package lists are up to date.
pkg update && pkg upgrade
2. Install Ollama
Termux now has an official package for Ollama, making installation very easy.
pkg install ollama
Note: If you see an error saying the package is missing or "no such file," run the following to fix a broken install:
pkg reinstall ollama
3. Start the AI Server
Ollama needs a background server to handle the AI logic. Run this command to start it in the background:
ollama serve &
- Tip: If logs appear, press Enter once to get your command prompt back.
- Important: You must run this command every time you open Termux before using the AI.
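If you'd rather not type the serve command every session, you can have Termux start the server automatically on login. A minimal sketch for your ~/.bashrc, assuming the default bash shell and that pgrep is available (in Termux it comes from the procps package):

```shell
# Add to ~/.bashrc: start the Ollama server on login if it
# is installed and not already running.
if command -v ollama >/dev/null && ! pgrep -x ollama >/dev/null; then
  nohup ollama serve >"$HOME/.ollama.log" 2>&1 &
fi
```

The pgrep guard prevents a second server from being spawned when you open multiple Termux sessions.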
Running Microsoft Phi
We will use the Phi-1.5 model. This model is unique because it was trained on code and textbooks rather than the entire internet, making it surprisingly smart at Python despite its tiny size.
- View Phi tags: ollama.com/library/phi
- Explore all models: ollama.com/library
For Standard Phones (Recommended)
Use the 1.3 Billion parameter version (Phi-1.5). It is lightweight and perfect for older devices.
Note: Since the official library defaults to newer models, we use a reliable community version for the classic 1.5.
ollama run tkdkid1000/phi-1_5
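Once the model is pulled, you don't have to use the interactive chat: ollama run also accepts a one-shot prompt as an argument, which is handy for scripting. A sketch (the guard just keeps it from erroring if ollama isn't installed yet):

```shell
# Ask a single question and exit, instead of opening the chat REPL.
MODEL="tkdkid1000/phi-1_5"
if command -v ollama >/dev/null; then
  ollama run "$MODEL" "Write a Python function that checks if a number is prime."
else
  echo "ollama not installed yet; see step 2 above."
fi
```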
For Mid-Range Phones (6GB+ RAM)
If you have a slightly better device, you can upgrade to Phi-2. It is significantly smarter at reasoning while still being efficient.
# 2.7 Billion parameters (The "Sweet Spot" for logic)
ollama run phi:2.7b
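You can also talk to the server over Ollama's REST API, which is what editor plugins do under the hood. A hedged sketch using curl against the /api/generate endpoint (the model tag here assumes you pulled phi:2.7b; swap in the community tag if you are on Phi-1.5):

```shell
# POST a prompt to the local server; "stream": false returns one
# complete JSON object instead of a token-by-token stream.
OLLAMA_URL="http://127.0.0.1:11434"
MODEL="phi:2.7b"
RESPONSE=$(curl -s --max-time 120 "$OLLAMA_URL/api/generate" \
  -d "{\"model\": \"$MODEL\", \"prompt\": \"Reverse a string in Python.\", \"stream\": false}" \
  || echo "no response; is the server running? (ollama serve &)")
echo "$RESPONSE"
```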
Troubleshooting
Error: "exec: "serve": executable file not found in $PATH"
- Fix: This is a common Termux bug where the system cannot find the internal server command. Run this one-time fix to create a system link:
ln -s $(which ollama) $PREFIX/bin/serve
Error: "manifest not found"
- Fix: This means the specific model tag isn't found. Search ollama.com for "phi-1.5" to find the latest active community tag.
Error: "Connection refused" or "Address already in use"
- Fix: The background server isn't running or is stuck. Kill the old process and restart it:
pkill ollama
ollama serve &
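After restarting, you can confirm the server actually came back: Ollama's root endpoint answers a plain GET with "Ollama is running". A quick check, assuming curl is installed (pkg install curl):

```shell
# The root endpoint replies "Ollama is running" when healthy.
STATUS=$(curl -s --max-time 5 http://127.0.0.1:11434/ || echo "unreachable")
echo "$STATUS"
```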
The model is hallucinating code!
- Fix: Phi-1.5 is an older base model. For complex coding tasks, try the newer (but heavier) Phi-3 Mini if your phone can handle it:
ollama run phi3
Why do this?
- Focused Intelligence: Unlike generic chat bots, Phi is specialized for code and logic.
- No Network Latency: Coding assistance without waiting on remote API calls.
- Battery Friendly: The 1.3B model is small enough that it won't drain your battery as fast as massive 7B models.