A Burp Suite extension that provides AI-powered security testing using local Ollama models instead of PortSwigger's paid Burp AI service.
- Ask Ollama – Right-click on selected text in Repeater, Proxy, or Decoder → "Ask Ollama" to get security-focused explanations
- Ollama tab – Dedicated AI panel in Repeater with conversation history
- Scanner integration – Right-click Scanner findings → "Ask Ollama" → Analyze or Validate false positive
- Login sequence – AI-assisted login flow generation; use as a Session Handling action
- Privacy – All data stays on your machine; nothing is sent to external APIs
- Offline – Works without internet (after models are pulled)
- Configurable – Ollama URL, model selection, timeout, context size, and system prompts
- Burp Suite (Community or Professional)
- Ollama installed and running locally
- At least one model pulled (e.g. `ollama pull llama3.2:3b`)
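Before loading the extension, you can confirm the Ollama API is reachable and see which models are available by querying its model-list endpoint (this mirrors what the extension's Test connection button presumably does):

```sh
# List models available to the local Ollama instance
curl http://localhost:11434/api/tags
```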
- Build the JAR: `./gradlew jar`
- In Burp: Extensions > Installed > Add → select `build/libs/burp-ollama.jar`
- Configure in Settings > Burp Ollama (Ollama URL, model, etc.)
- Ensure Ollama is running (`ollama serve` or the Ollama app)
- In Repeater, Proxy, or Decoder, select some text (e.g. an HTTP header, cookie, or JavaScript snippet)
- Right-click → Ask Ollama
- View the AI response in the dialog
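Under the hood, the extension talks to Ollama's standard HTTP API. The request below is an illustrative sketch of what an "Ask Ollama" query might look like on the wire, not the extension's actual code; the endpoint and fields are Ollama's documented chat API, and the prompt wording is made up (the real system prompt is configurable in Settings):

```sh
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2:3b",
  "messages": [
    {"role": "system", "content": "You are a web security expert. Explain the selected artifact."},
    {"role": "user", "content": "Set-Cookie: session=abc123; HttpOnly"}
  ],
  "stream": false
}'
```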
| Setting | Default | Description |
|---|---|---|
| Ollama base URL | `http://localhost:11434` | Where the Ollama API is running |
| Model | `llama3.2:3b` | Model to use (must be pulled in Ollama) |
| Timeout | 120 s | Request timeout |
| Context size | 4096 | `num_ctx` for Ollama |
| Streaming | Off | Stream tokens as they arrive |
| Route via Burp | Off | Send Ollama requests through Burp HTTP API (proxy, TLS) |
| System prompt | (editable) | Default security-focused prompt |
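Context size and streaming most likely map onto fields of the Ollama request body. The `options.num_ctx` and `stream` fields below are Ollama's documented API; the exact mapping to the extension's settings is an assumption about its internals:

```sh
# Illustrative: context size and streaming as Ollama API request fields
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2:3b",
  "messages": [{"role": "user", "content": "..."}],
  "options": {"num_ctx": 4096},
  "stream": false
}'
```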
Use Test connection in Settings to verify Ollama is reachable and list available models.
Route via Burp: When Ollama runs on a different host (e.g. a remote server), enable this and configure Burp's proxy settings so it can reach that host. Requests then go through Burp's HTTP stack (proxy, TLS, upstream proxy).

The Advanced prompts section lets you customize the prompts used for Explain headers, Analyze, Validate false positive, Decipher code, and Generate login.
Right-click on audit issues in the Scanner tab → Ask Ollama:
- Analyze – Exploitation steps and analysis
- Validate false positive – AI evaluation of whether the finding is likely real or a false positive (especially useful for access control / IDOR findings)
Configure a login request template in Settings > Burp Ollama > Login sequence, using {{username}} and {{password}} as placeholders. Then add a session handling rule under Project > Settings > Session handling with the action "Run session handling action" → Ollama Login. The action sends the login request, extracts cookies from the response, and injects them into outgoing requests. Optionally, use Generate from description to have Ollama suggest a raw HTTP request from a plain-text description.
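For example, a minimal login template might look like the following; the URL, host, field names, and body format are placeholders to adapt to your target application:

```
POST /login HTTP/1.1
Host: target.example.com
Content-Type: application/x-www-form-urlencoded

username={{username}}&password={{password}}
```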
Build the extension:

```sh
./gradlew jar
```

JAR output: `build/libs/burp-ollama.jar`

Run the tests:

```sh
./gradlew test
```

Unit tests include:
- OllamaResponseParser – JSON parsing for tags and chat responses
- OllamaService – HTTP client behavior (mocked with MockWebServer)
- OllamaConfig – Settings persistence
- OllamaAuditIssueFormatter – Scanner finding formatting
- OllamaLoginSessionAction – Login config and template replacement
- SecurityPrompts – Default prompt content
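To run a single suite while iterating, Gradle's built-in test filtering works; the class-name pattern below assumes test classes are named after the components listed above (e.g. a hypothetical OllamaResponseParserTest):

```sh
./gradlew test --tests "*OllamaResponseParser*"
```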
License: MIT