Pocket AI: Run DeepSeek Offline on Android

January 15, 2026

Project: Pocket AI (Offline DeepSeek on Android)

Turn your old Android phone into a private, offline AI powerhouse.

This guide will show you how to run DeepSeek-R1, a powerful reasoning ("thinking") AI model, directly on your Android device using Termux. No internet required after the initial download. No data tracking. 100% private.


Prerequisites

  1. Android Phone: Any reasonably recent Android phone (at least 4GB of RAM recommended).
  2. Termux App: A terminal emulator for Android.
    • Recommended: Download from F-Droid (The Play Store version is outdated).

Step-by-Step Installation

1. Install & Update Termux

Open Termux and run the following command to ensure your package lists are up to date.

pkg update && pkg upgrade

2. Install Ollama

Termux now has an official package for Ollama, making installation very easy.

pkg install ollama

Note: If you later see an error saying the package is missing or "no such file", run pkg reinstall ollama to repair a broken install.
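
Before moving on, you can confirm the install worked. This is a small sketch, assuming ollama landed on your PATH; the fallback message simply points back to the reinstall tip above.

```shell
# Confirm the binary is on your PATH and reports a version.
if command -v ollama >/dev/null 2>&1; then
  ollama --version
else
  echo "ollama not found; try: pkg reinstall ollama"
fi
```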

3. Start the AI Server

Ollama uses a client-server design: a background server loads the model and answers requests. Run this command to start it in the background:

ollama serve &
  • Tip: If you see logs appear, just press Enter once to get your command prompt back.
  • Important: You must run this command every time you open Termux before using the AI.
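
Since this step is easy to forget, a small guard helps. This is a minimal sketch: the start_ollama helper name is my own, and it assumes ollama is on your PATH. It starts the server only when one is not already running, and sends the logs to ~/ollama.log so they do not clutter your prompt.

```shell
# Hypothetical helper: start the Ollama server only if it is not already running.
# Server logs go to ~/ollama.log instead of the terminal.
start_ollama() {
  if pgrep -x ollama >/dev/null 2>&1; then
    echo "already running"
  else
    ollama serve > "$HOME/ollama.log" 2>&1 &
    echo "started"
  fi
}

start_ollama
```

If you add the function to your shell startup file (e.g. ~/.bashrc in Termux's default bash), it will be available every time you open Termux.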

Running DeepSeek

We will use a DeepSeek-R1 Distill model. These are smaller models fine-tuned on the output of the full DeepSeek-R1, so they keep much of its reasoning ability at a fraction of the size.

Start with the 1.5 billion parameter version. It is fast and responsive, and works on most phones.

ollama run deepseek-r1:1.5b
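
By default, ollama run opens an interactive chat (type /bye to exit). You can also pass a prompt as an argument for a one-shot answer, which is handy for scripting; the prompt below is just an example.

```shell
# One-shot mode: print a single answer and exit instead of opening a chat.
ollama run deepseek-r1:1.5b "Explain what a prime number is in one sentence."
```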

For High-End Phones (8GB+ RAM)

If you have a powerful device (e.g. Snapdragon 8 Gen 2/3), you can try the larger, smarter versions:

# 7 Billion parameters (Slower, but smarter)
ollama run deepseek-r1:7b

# 8 Billion parameters (Llama-3 based)
ollama run deepseek-r1:8b
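
Not sure which size your phone can handle? Total RAM is a reasonable first filter. This sketch reads it from /proc/meminfo (available in Termux, since Android is Linux-based); the threshold is a rough rule of thumb, not an official requirement.

```shell
# Rough rule of thumb (assumption): the 7b/8b models want an 8GB-class device.
mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
if [ "$mem_kb" -ge 7000000 ]; then
  echo "8GB-class device: deepseek-r1:7b or :8b may fit"
else
  echo "stick with deepseek-r1:1.5b"
fi
```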

Troubleshooting

Error: "Connection refused" or "Could not connect to running instance"

  • Fix: This means the background server isn't running. Run ollama serve & again.
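
To confirm whether the server is actually up, you can probe Ollama's local HTTP API, which listens on port 11434 by default; /api/version is a lightweight endpoint that just returns the version string.

```shell
# Probe the local Ollama API (default port 11434).
if curl -fs http://localhost:11434/api/version >/dev/null 2>&1; then
  echo "server is up"
else
  echo "server is down; run: ollama serve &"
fi
```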

Error: "No such file or directory" when running ollama

  • Fix: Your installation might be corrupted. Run pkg reinstall ollama.

The model is too slow!

  • Fix: Try a smaller model. tinyllama is a much smaller, faster option:
ollama run tinyllama
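
Models are large (the 1.5b download alone is on the order of a gigabyte), so if you experiment with several, it is worth pruning the ones you no longer use. These are standard Ollama commands:

```shell
# List downloaded models and their sizes.
ollama list

# Remove a model you no longer need, freeing its storage.
ollama rm deepseek-r1:7b
```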

Why do this?

  1. Privacy: Your chats never leave your device.
  2. Offline Access: Works in an airplane, a bunker, or a forest.
  3. Hardware Reuse: Give that old phone in your drawer a new purpose as a dedicated AI assistant.