Project: Pocket AI (Offline DeepSeek on Android)
Turn your old Android phone into a private, offline AI powerhouse.
This guide will show you how to run DeepSeek-R1, a powerful reasoning (thinking) AI model, directly on your Android device using Termux. No internet required after the initial download. No data tracking. 100% Private.
Prerequisites
- Android Phone: Any reasonably modern Android device (4 GB of RAM or more recommended).
- Termux App: A terminal emulator for Android.
- Recommended: Download from F-Droid (The Play Store version is outdated).
Step-by-Step Installation
1. Install & Update Termux
Open Termux and run the following command to ensure your package lists are up to date.
pkg update && pkg upgrade
2. Install Ollama
Termux now has an official package for Ollama, making installation very easy.
pkg install ollama
Note: If you later see an error saying the package is missing or "no such file", run `pkg reinstall ollama` to fix a broken install.
3. Start the AI Server
Ollama needs a background server to handle the AI logic. Run this command to start it in the background:
ollama serve &
- Tip: If server logs appear, just press `Enter` once to get your command prompt back.
- Important: You must run this command every time you open Termux before using the AI.
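If you'd rather not type the command each session, a small guard in `~/.bashrc` can start the server automatically. This is a sketch, not part of the official Ollama setup; it assumes `pgrep` is available (in Termux it comes from the `procps` package):

```shell
# Sketch: append to ~/.bashrc so the Ollama server starts with each new session.
# Assumes pgrep is installed (in Termux: pkg install procps).
if ! pgrep -f "ollama serve" > /dev/null 2>&1; then
  ollama serve > "$HOME/ollama.log" 2>&1 &
fi
```

Logs go to `~/ollama.log` instead of cluttering your prompt, and the `pgrep` check keeps you from accidentally launching a second server.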
Running DeepSeek
We will use the DeepSeek-R1 Distill model. This is a highly efficient version of the massive DeepSeek model, optimized to run on smaller devices while preserving much of its reasoning ability.
- View DeepSeek R1 tags: ollama.com/library/deepseek-r1
- Explore all models: ollama.com/library
For Standard Phones (Recommended)
Use the 1.5 billion parameter version. It is fast and responsive, and works on most phones.
ollama run deepseek-r1:1.5b
For High-End Phones (8GB+ RAM)
If you have a powerful device (e.g., Snapdragon 8 Gen 2/3), you can try the larger, smarter versions:
# 7 Billion parameters (Slower, but smarter)
ollama run deepseek-r1:7b
# 8 Billion parameters (Llama-3 based)
ollama run deepseek-r1:8b
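If you are unsure which tier your phone falls into, you can read total RAM from `/proc/meminfo` (available in Termux, as on any Linux system) and map it to the sizes above. The thresholds below are the rough rules of thumb from this guide, not official requirements:

```shell
# Print total RAM and a rough model-size suggestion based on it.
mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
mem_gb=$((mem_kb / 1024 / 1024))
if [ "$mem_gb" -ge 8 ]; then
  echo "${mem_gb} GB RAM: deepseek-r1:7b or deepseek-r1:8b should be usable"
elif [ "$mem_gb" -ge 4 ]; then
  echo "${mem_gb} GB RAM: stick with deepseek-r1:1.5b"
else
  echo "${mem_gb} GB RAM: try a very small model such as tinyllama"
fi
```

Note that the OS and other apps also need memory, so treat the suggestion as an upper bound, not a guarantee.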
Troubleshooting
**Error: "Connection refused" or "Could not connect to running instance"**
- Fix: This means the background server isn't running. Run `ollama serve &` again.
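To confirm whether the server is actually up before trying anything else, you can probe it with `curl` (you may need `pkg install curl` first). Ollama's server listens on port 11434 by default and replies with "Ollama is running" at its root URL:

```shell
# Probe the local Ollama server; an empty reply means it is not running.
reply=$(curl -s --max-time 3 http://127.0.0.1:11434/ || true)
if printf '%s' "$reply" | grep -q "Ollama is running"; then
  msg="server is up"
else
  msg="server is down - start it with: ollama serve &"
fi
echo "$msg"
```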
**Error: "No such file or directory" when running ollama**
- Fix: Your installation might be corrupted. Run `pkg reinstall ollama`.
**The model is too slow!**
- Fix: Try a smaller model. `tinyllama` is one of the smallest and fastest options:
ollama run tinyllama
Why do this?
- Privacy: Your chats never leave your device.
- Offline Access: Works in an airplane, a bunker, or a forest.
- Hardware Reuse: Give that old phone in your drawer a new purpose as a dedicated AI assistant.