Autonomous AI agent for Android. No other hardware required.
Built at Sundai Club.
CellClaw runs natively on your phone and can read your screen, send messages, make calls, control apps, take photos, and more — all orchestrated by an LLM through a tool-use loop with a heartbeat.
33 tools · 12 categories · 4 providers · Android 8.0+ · No middleman server
- No middleman — no CellClaw server, your phone talks directly to your AI provider
- Bring your own model — Claude, GPT, Gemini, or any OpenRouter model
- 33 tools, zero scripts — the AI figures out what to do from plain language
- Real Android control — not just chat; it taps, swipes, types, launches apps, reads your screen
- You set the limits — choose how autonomous the AI should be, approve sensitive actions
CellClaw has 33 tools across 12 categories that give the AI deep access to your phone:
| Category | Tools | Examples |
|---|---|---|
| Messaging | SMS, WhatsApp, Telegram, Instagram | "Text mom I'll be late", "Reply to my last WhatsApp" |
| Phone | Calls, call history | "Call the pizza place", "Who called me today?" |
| Screen Control | Read screen, tap, swipe, type, scroll | "Swipe right on Tinder", "Open Settings and turn on Bluetooth" |
| Vision | Screenshots, image analysis | "What's on my screen?", "Read this menu for me" |
| Contacts & Calendar | Search, create events | "When's my next meeting?", "Add John to my contacts" |
| Camera | Photos, video | "Take a photo", "Record a 10 second video" |
| Files | Read, write, list | "Save this to a file", "What's in my Downloads?" |
| Location | GPS coordinates | "Where am I right now?" |
| Browser | Open URLs, web search | "Search for nearby restaurants" |
| Email | Send email | "Email my boss the report" |
| Sensors | Accelerometer, gyroscope, light, pressure | "What's the ambient light level?" |
| System | Shell commands, notifications, scheduling | "Run ping google.com", "Remind me in 30 minutes" |
The AI decides which tools to use. You just describe what you want in plain language.
- Android 8.0+ (API 26)
- An API key from one of: Anthropic, OpenAI, Google AI Studio, or OpenRouter
- Download the latest APK from Releases
- Enable "Install from unknown sources" if prompted
- Open CellClaw
CellClaw walks you through a 4-step setup:
1. Choose your AI provider — Pick Anthropic, OpenAI, Gemini, or OpenRouter and enter your API key. Keys are encrypted on-device using Android Keystore.
2. Set your name (optional) — Helps personalize responses.
3. Choose an autonomy level — Controls how much the AI can do without asking:
   - Full Auto — Everything auto-approved. No interruptions.
   - Balanced — Reads are automatic, sends/writes ask first. (Recommended)
   - Cautious — Almost everything requires approval.
4. Grant permissions — CellClaw needs three system permissions:
   - Display over other apps — for the floating overlay bubble
   - Accessibility service — for reading screens and controlling apps
   - Notification access — for reading and acting on notifications

   Each permission links directly to the Android settings page where you enable it.
After setup, CellClaw starts a foreground service and you're ready to go.
CellClaw runs an agent loop — a cycle where the AI thinks, calls tools, observes results, and repeats until the task is done:
```
You: "Open Tinder and swipe right on people over 30"
  ↓
AI thinks → calls app.launch(tinder)
  ↓
AI thinks → calls screen.read() → sees the profile
  ↓
AI thinks → calls screen.capture() + vision.analyze() → reads the age
  ↓
AI thinks → calls app.automate(swipe_right) or app.automate(swipe_left)
  ↓
AI thinks → repeats...
```
The AI decides the strategy. You don't write rules or scripts — it figures out what to do based on what it sees on screen.
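The loop above can be sketched in a few lines of Kotlin. The types and function names here are hypothetical, chosen only to illustrate the think → call tool → observe cycle; CellClaw's real implementation lives in `agent/AgentLoop.kt` and will differ.

```kotlin
// Hypothetical sketch of a think → tool → observe agent loop.
// `think` stands in for one LLM turn; `callTool` executes a tool and
// returns its result as text appended to the conversation history.

sealed class ModelStep
data class ToolCall(val name: String, val args: Map<String, String>) : ModelStep()
data class FinalAnswer(val text: String) : ModelStep()

fun runAgentLoop(
    task: String,
    think: (history: List<String>) -> ModelStep,
    callTool: (ToolCall) -> String
): String {
    val history = mutableListOf("user: $task")
    while (true) {
        when (val step = think(history)) {
            // Tool result goes back into the history for the next turn
            is ToolCall -> history += "tool ${step.name}: ${callTool(step)}"
            is FinalAnswer -> return step.text
        }
    }
}
```

The key property is that the model, not the app, chooses the next step each iteration by inspecting the accumulated history.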
CellClaw reads your screen in two ways:
- Accessibility tree (`screen.read`) — Fast and free. Returns text, buttons, and UI elements as structured data. Works great for most apps.
- Screenshots (`screen.capture` + `vision.analyze`) — Uses the AI's vision capability to understand images, game boards, charts, and canvas-rendered content that isn't in the accessibility tree.
The AI picks the right approach based on the task.
For long-running tasks (playing a chess game, monitoring an app, waiting for a response), CellClaw uses a heartbeat — a periodic check-in that wakes the AI to look at the screen and act if needed.
- Starts at 5-second intervals, backs off to 60 seconds when nothing is happening
- Resets to fast polling when the AI takes action
- Keeps the wake lock alive so Android doesn't kill the process
- Completely invisible — you won't see heartbeat activity in the chat
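The back-off behavior can be modeled as a small pure function. Note the doubling schedule is an assumption for illustration; the README only states the 5-second floor and 60-second ceiling, and `HeartbeatManager` may use a different curve.

```kotlin
// Illustrative back-off for the heartbeat: start at 5 s, grow the
// interval while nothing happens (doubling assumed), cap at 60 s,
// and snap back to 5 s whenever the AI takes an action.

const val MIN_INTERVAL_MS = 5_000L
const val MAX_INTERVAL_MS = 60_000L

fun nextHeartbeatInterval(currentMs: Long, aiTookAction: Boolean): Long =
    if (aiTookAction) MIN_INTERVAL_MS
    else (currentMs * 2).coerceAtMost(MAX_INTERVAL_MS)
```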
When the AI wants to do something sensitive (send a message, make a call, execute a script), CellClaw can ask for your approval first. This depends on your autonomy level:
| Action Type | Full Auto | Balanced | Cautious |
|---|---|---|---|
| Read SMS, contacts, screen | Auto | Auto | Auto |
| Launch apps | Auto | Auto | Ask |
| Send SMS, make calls | Auto | Ask | Ask |
| Run shell commands | Auto | Ask | Ask |
Approvals show up in the chat and in the notification. You can tap Approve, Deny, or Always Allow (auto-approves that tool for the session).
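The approval matrix above reduces to a small decision function. This is a sketch of the table only, with invented enum names; the real logic in `AutonomyPolicy.kt` covers all 33 tools and may be structured differently.

```kotlin
// Illustrative-only encoding of the approval matrix.

enum class Autonomy { FULL_AUTO, BALANCED, CAUTIOUS }
enum class Action { READ, LAUNCH_APP, SEND_OR_CALL, SHELL }

fun requiresApproval(level: Autonomy, action: Action): Boolean = when (action) {
    Action.READ -> false                                // auto at every level
    Action.LAUNCH_APP -> level == Autonomy.CAUTIOUS     // only Cautious asks
    Action.SEND_OR_CALL,
    Action.SHELL -> level != Autonomy.FULL_AUTO         // Balanced and Cautious ask
}
```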
- Chat screen — Full conversation interface with tool execution status
- Floating overlay — Tap the bubble for quick questions without leaving your current app
- Notification reply — Reply directly from the persistent notification
- "Explain Screen" button — One-tap screenshot + AI explanation from the notification
- Quick Settings tile — Toggle CellClaw from the notification shade
- Wake word — Hands-free activation (requires microphone permission)
| Provider | Models |
|---|---|
| Anthropic | Claude Sonnet 4.6, Claude Opus 4.6, Claude Haiku 4.5 |
| OpenAI | GPT-5.2, GPT-5.2 Chat Latest, GPT-5 Mini, GPT-4.1, GPT-4.1 Mini |
| Google | Gemini 3.1 Pro, Gemini 3 Flash, Gemini 2.5 Flash/Pro |
| OpenRouter | Any model available on OpenRouter (Gemini, Claude, GPT, etc.) |
Switch providers and models anytime in Settings.
- API keys are encrypted on-device using Android Keystore (AES-256-GCM). Never stored in plaintext.
- You control what the AI can do via the autonomy level. Sensitive actions can require explicit approval.
- No CellClaw backend — there is no CellClaw server. Your data goes directly from your phone to your chosen AI provider.
- Cloud inference — screen content, messages, and prompts are sent to the AI provider's API for processing. Review your provider's privacy policy (Anthropic, OpenAI, Google).
- Biometric gate available for sensitive operations (fingerprint/face unlock).
- No telemetry, no analytics, no tracking from CellClaw itself.
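For illustration, here is what the AES-GCM scheme looks like in plain JVM Kotlin. On-device, CellClaw keeps the key inside the Android Keystore (via `KeyGenParameterSpec`) so it never leaves the device's secure hardware; this sketch uses an ordinary in-memory key only to show the cipher usage, and the function names are invented.

```kotlin
import javax.crypto.Cipher
import javax.crypto.SecretKey
import javax.crypto.spec.GCMParameterSpec

// AES-256-GCM round trip. ENCRYPT_MODE with no explicit params lets the
// provider generate a fresh random IV, which must be stored alongside
// the ciphertext and supplied again for decryption.

fun encryptApiKey(plain: ByteArray, key: SecretKey): Pair<ByteArray, ByteArray> {
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, key)
    return cipher.iv to cipher.doFinal(plain)
}

fun decryptApiKey(iv: ByteArray, ciphertext: ByteArray, key: SecretKey): ByteArray {
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.DECRYPT_MODE, key, GCMParameterSpec(128, iv))  // 128-bit auth tag
    return cipher.doFinal(ciphertext)
}
```

GCM authenticates as well as encrypts, so a tampered ciphertext fails to decrypt rather than yielding garbage.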
```bash
git clone https://github.com/jordanthejet/cellclaw.git
cd cellclaw

# Build debug APK
./gradlew assembleDebug

# Install on connected device
./gradlew installDebug
```

Requirements:
- Android Studio (Ladybug or newer) or Android SDK command-line tools
- JDK 17
- Android SDK 35
Physical device recommended — the accessibility service doesn't fully work on emulators.
Project Structure
```
app/src/main/kotlin/com/cellclaw/
├── agent/      # AgentLoop, HeartbeatManager, AutonomyPolicy
├── approval/   # ApprovalQueue, BiometricGate
├── config/     # AppConfig, Identity (system prompt), SecureKeyStore
├── di/         # Hilt dependency injection modules
├── memory/     # Room database, ConversationStore, SemanticMemory
├── provider/   # Anthropic, OpenAI, Gemini, OpenRouter implementations
├── scheduler/  # WorkManager scheduled tasks
├── service/    # Foreground service, Accessibility, Overlay, Notification listener
├── skills/     # Markdown-defined reusable workflows
├── tools/      # All 33 tool implementations
├── ui/         # Jetpack Compose screens and ViewModels
├── voice/      # Text-to-speech
└── wakeword/   # TensorFlow Lite wake word detection
```
1. Create `tools/YourTool.kt`:

```kotlin
class YourTool @Inject constructor() : Tool {
    override val name = "category.action"
    override val description = "What this tool does"
    override val parameters = ToolParameters(listOf(
        ToolParam("param_name", "string", "Description", required = true)
    ))
    override val requiresApproval = true

    override suspend fun execute(params: JsonObject): ToolResult {
        // Your implementation
        return ToolResult.success(mapOf("result" to "value"))
    }
}
```

2. Register it in `di/AppModule.kt`
3. Add it to `AutonomyPolicy.kt` if it requires approval
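If the Hilt module collects tools into a multibound set (an assumption; check `di/AppModule.kt` for the actual wiring), the registration step might look like this:

```kotlin
import dagger.Binds
import dagger.Module
import dagger.hilt.InstallIn
import dagger.hilt.components.SingletonComponent
import dagger.multibindings.IntoSet

// Hypothetical Hilt multibinding: contributes YourTool to the
// Set<Tool> that the agent loop iterates over. The real module
// may register tools differently.
@Module
@InstallIn(SingletonComponent::class)
abstract class YourToolModule {
    @Binds
    @IntoSet
    abstract fun bindYourTool(impl: YourTool): Tool
}
```

With `@IntoSet`, new tools are picked up automatically wherever a `Set<Tool>` is injected, so no central list needs editing.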
1. Create `provider/YourProvider.kt` implementing the `Provider` interface
2. Handle request/response translation to the provider's API format
3. Register it in `ProviderManager`
- Android app with agent loop and 33 tools
- Multi-provider support (Anthropic, OpenAI, Google, OpenRouter)
- Heartbeat system for long-running tasks
- Wake word activation
- On-device inference (local models)
- iOS app
- Skill sharing and community skill packs
- ZeroClaw — lightweight Rust-based AI assistant infrastructure
Apache 2.0 — see LICENSE