This project lets you export the Traffic Flows (Blocked & Threats) from your Unifi Console in either of two ways:
- to a CSV file for ingestion into other platforms - e.g. your SIEM, or MongoDB visualized with Grafana
- in api-server mode, so you can query the data directly, backed by an InfluxDB that Grafana supports natively (a dashboard with GeoIP enrichment is included!)
- Automates login to the Unifi Controller
- Navigates to the Insights page
- Downloads a CSV report with a configurable time range
- Optionally downloads threat data
- Configurable via environment variables
- Handles self-signed certificates for local controllers
- Built with Node.js 22.x and Bun 1.9.x
While this project is likely to work with older/different versions, compatibility is not guaranteed.
This tool has been tested and verified to work on:
- Windows 10/11
- Linux: Ubuntu, Debian
- macOS: x86 and ARM64 (Apple M-series)
Tested and verified with the following Unifi versions:
- Network: 9.1.118
- UniFi OS: 4.2.8
The tool may work with other versions, but these are the ones explicitly tested.
- Clone this repository
- Install dependencies:

```bash
# With npm
npm install

# Or with Bun
bun install
```

- Copy the example environment file and update it with your credentials:

```bash
cp .env.example .env
# Edit .env with your details
```

Configure the script by editing the .env file with your:
- Unifi Controller credentials
- Controller URL
- Time range for the report (THIRTY_MINUTES, HOUR, DAY, WEEK, MONTH)
- Download location
- Browser settings (headless mode, etc.)
```
UNIFI_USERNAME=playwright
UNIFI_PASSWORD=your_password
UNIFI_URL=https://<ip.of.your.unifi.controller.or.cloudkey>
TIME_RANGE=HOUR
DOWNLOAD_THREATS=false
```
Before using this tool, you should create a dedicated user in your Unifi Controller:
- Click on Settings -> Admins & Users
- Click Admins
- Click Create New Admin
- Check the "Restrict to Local Access Only" box
- Username: `playwright`
- Password: `<secure password>`
- Select "Use a Predefined Role"
- Role: Super Admin
- Click Create
The application can run in two modes:
This mode downloads CSV files to disk for manual processing or importing into other systems.
```bash
# Run the downloader only
bun run download

# Or use the npm script
npm run download
```

This mode runs a complete stack with an API server, InfluxDB for storage, and GeoIP enrichment.
```bash
# Start only the API server (requires InfluxDB running separately)
bun run api

# Start the complete application (downloader + importer + API server)
bun run start:all

# Or use Docker Compose to run the entire stack (recommended)
docker-compose up -d
```

Set DOWNLOAD_THREATS=true in your .env file to also download a second CSV file with threat data.
Set DIRECT_IMPORT=true in your .env file to skip saving CSV files and import directly to InfluxDB.
For automated, regular downloads you can use your platform's scheduler.

On Linux/macOS, add a cron job:

```bash
# Example: Run daily at 2 AM
0 2 * * * cd /path/to/unifi-flows && node src/unifi-downloader.js >> logs/downloads.log 2>&1
```

On Windows, create a batch file run-download.bat:

```bat
cd C:\path\to\unifi-flows
node src\unifi-downloader.js
```

Then set up a scheduled task to run this batch file.
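For example, the scheduled task can be registered from an elevated Command Prompt with `schtasks` (the task name and path below are placeholders):

```bat
schtasks /Create /TN "Unifi Flows Download" /TR "C:\path\to\unifi-flows\run-download.bat" /SC DAILY /ST 02:00
```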
- Login Issues: Verify your credentials in the .env file
- Selector Issues: The script may need updates if the Unifi UI changes
- Debug Mode: Set `HEADLESS=false` and `SLOW_MO=50` in .env to watch the automation in action
- SSL Errors: For local controllers with self-signed certificates, ensure `IGNORE_HTTPS_ERRORS=true` is set
- Geomap Issues: If you don't see data on the world map, verify that your data contains valid latitude/longitude coordinates
This project can store data directly in InfluxDB for better performance and direct Grafana integration.
The easiest way to get started with the full stack (InfluxDB, Grafana, and optionally the application itself) is to use Docker Compose:
```bash
# Start the entire stack
docker-compose up -d

# To also build and run the application in Docker
docker-compose up -d --build
```

This will:
- Start InfluxDB on port 8086
- Start Grafana on port 3000
- Optionally build and run the application (uncomment the relevant section in docker-compose.yml)
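The repository ships its own docker-compose.yml, so use that file for real deployments; the sketch below only illustrates the shape of the stack (service names, ports, and the InfluxDB init variables match the defaults described elsewhere in this README, everything else is simplified):

```yaml
# Illustrative sketch only - see the docker-compose.yml in this repository for the real definition
services:
  influxdb:
    image: influxdb:2.7
    ports:
      - "8086:8086"
    environment:
      DOCKER_INFLUXDB_INIT_MODE: setup
      DOCKER_INFLUXDB_INIT_USERNAME: admin
      DOCKER_INFLUXDB_INIT_PASSWORD: password123
      DOCKER_INFLUXDB_INIT_ORG: unifi-flows
      DOCKER_INFLUXDB_INIT_BUCKET: network-data
      DOCKER_INFLUXDB_INIT_ADMIN_TOKEN: my-super-secret-auth-token
    volumes:
      - influxdb-data:/var/lib/influxdb2
  grafana:
    image: grafana/grafana:latest
    ports:
      - "3000:3000"
    volumes:
      - grafana-storage:/var/lib/grafana
    depends_on:
      - influxdb
volumes:
  influxdb-data:
  grafana-storage:
```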
Once the stack is running, you can access the various services at these URLs:
Grafana:
- URL: http://localhost:3000
- Default credentials: username `admin`, password `admin`
- Dashboards: After logging in, you can import the dashboard from `grafana-dashboard.json`
- Navigate to http://localhost:3000 and log in with admin/admin
- Go to Dashboards → Import
- Either:
  - Upload the `grafana-dashboard.json` file, or
  - Copy and paste the contents of the file
- Select your InfluxDB data source
- Click Import
The dashboard includes:
- Network traffic volume over time
- Top protocols and applications
- Geographic traffic visualization
- Threat monitoring panels
When configuring the InfluxDB data source in Grafana, use `influxdb:8086` as the URL, not `localhost:8086`. This is because in Docker networking, containers refer to each other by service name.
Configuration steps:
- In Grafana, go to Configuration → Data Sources → Add data source
- Select "InfluxDB"
- Use these settings:
  - URL: `http://influxdb:8086` (must use the service name, not localhost)
  - Query Language: Flux
  - Organization: unifi-flows
  - Token: my-super-secret-auth-token
  - Default Bucket: network-data
- Click "Save & Test"
API server:
- URL: http://localhost:3001/api
- Documentation:
  - RapiDoc UI: http://localhost:3001/api/docs
  - Swagger UI: http://localhost:3001/api/docs/swagger
  - OpenAPI Spec: http://localhost:3001/api/openapi.json
- Health Check: http://localhost:3001/api/health
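A quick way to check that the API server is up is to hit the health endpoint listed above, for example with curl:

```bash
# Liveness check against the API server
curl http://localhost:3001/api/health
```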
InfluxDB:
- URL: http://localhost:8086
- Query Language: Flux
- Default credentials:
  - Username: `admin`
  - Password: `password123`
- Organization: `unifi-flows`
- Bucket: `network-data`
- Token: `my-super-secret-auth-token`
Note: For production use, you should change all default passwords in the docker-compose.yml and .env files.
If you prefer to set up components individually:
```bash
# Start InfluxDB container
docker run -d --name influxdb \
  -p 8086:8086 \
  -v influxdb-data:/var/lib/influxdb2 \
  -v influxdb-config:/etc/influxdb2 \
  -e DOCKER_INFLUXDB_INIT_MODE=setup \
  -e DOCKER_INFLUXDB_INIT_USERNAME=admin \
  -e DOCKER_INFLUXDB_INIT_PASSWORD=password123 \
  -e DOCKER_INFLUXDB_INIT_ORG=unifi-flows \
  -e DOCKER_INFLUXDB_INIT_BUCKET=network-data \
  -e DOCKER_INFLUXDB_INIT_ADMIN_TOKEN=my-super-secret-auth-token \
  influxdb:2.7

# To view the UI, visit http://localhost:8086 in your browser
# and log in with admin/password123
```
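Before pointing the application at the container, you can confirm InfluxDB is healthy via its standard health endpoint:

```bash
# InfluxDB 2.x reports a "pass" status once the instance is ready
curl http://localhost:8086/health
```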
Set the following in your .env file:

```
USE_INFLUXDB=true
INFLUXDB_URL=http://localhost:8086
INFLUXDB_TOKEN=my-super-secret-auth-token
INFLUXDB_ORG=unifi-flows
INFLUXDB_BUCKET=network-data
DIRECT_IMPORT=false
```
Set DIRECT_IMPORT=true to skip saving CSV files and import directly to InfluxDB.
Install Grafana:

```bash
# Run Grafana container
docker run -d --name grafana \
  -p 3000:3000 \
  -v grafana-storage:/var/lib/grafana \
  grafana/grafana:latest
```
Configure InfluxDB Data Source:
- Open Grafana at http://localhost:3000 (default login: admin/admin)
- Go to Configuration > Data Sources > Add data source
- Select InfluxDB
- Set URL to `http://influxdb:8086` (or use `http://localhost:8086` if not using Docker networking)
- In the Auth section, set your Organization, Token, and Default Bucket
- Test connection and Save
Import the Dashboard:
- Go to Dashboards > Import
- Copy the contents of the `grafana-dashboard.json` file
- Click "Load" and then "Import"
- You should now see the Unifi Network Traffic Dashboard with your data
The dashboard uses Grafana's native Geomap Panel for geographical visualization. No additional plugins are required.
This project can enrich your network traffic data with geolocation information for better visualization and analysis using Grafana's native Geomap Panel.
The application uses free geolocation APIs to look up IP address information:
- Primary: ipapi.co
  - 30,000 lookups per month
  - Comprehensive geolocation data
  - No API key required for basic usage
- Secondary (fallback): ip-api.com
  - 45 lookups per minute
  - Used as a fallback if the ipapi.co rate limit is reached
  - No API key required for basic usage
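Both providers can be queried directly from the command line, which is handy for seeing what fields are available before relying on the enrichment:

```bash
# Primary provider (ipapi.co) - JSON with country, city, latitude/longitude, org, etc.
curl https://ipapi.co/8.8.8.8/json/

# Fallback provider (ip-api.com) - similar data under slightly different field names
curl http://ip-api.com/json/8.8.8.8
```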
The system implements intelligent caching to minimize API calls:
- In-memory cache for fast lookups
- File-based cache with 30-day TTL
- Rate limiting to respect API service limits
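A minimal sketch of this lookup flow (cache first, primary provider, then fallback) is shown below. It is illustrative only, not the project's actual module: the cache file path and function name are made up, and rate limiting is omitted for brevity.

```javascript
// Illustrative sketch - not the project's actual geo-enrichment code.
import fs from "node:fs";

const CACHE_FILE = "./geoip-cache.json";        // hypothetical cache location
const TTL_MS = 30 * 24 * 60 * 60 * 1000;        // 30-day TTL, as described above
const cache = fs.existsSync(CACHE_FILE)
  ? JSON.parse(fs.readFileSync(CACHE_FILE, "utf8"))
  : {};

async function lookupIp(ip) {
  // 1. Serve from the file-backed cache while the entry is fresh
  const hit = cache[ip];
  if (hit && Date.now() - hit.fetchedAt < TTL_MS) return hit.data;

  let data;
  try {
    // 2. Primary provider: ipapi.co
    const res = await fetch(`https://ipapi.co/${ip}/json/`);
    if (!res.ok) throw new Error(`ipapi.co returned ${res.status}`);
    data = await res.json();
  } catch {
    // 3. Fallback provider: ip-api.com
    const res = await fetch(`http://ip-api.com/json/${ip}`);
    data = await res.json();
  }

  // 4. Persist the result so repeated lookups don't burn API quota
  cache[ip] = { data, fetchedAt: Date.now() };
  fs.writeFileSync(CACHE_FILE, JSON.stringify(cache));
  return data;
}

// Example usage (field names differ slightly between the two providers)
lookupIp("8.8.8.8").then((info) => console.log(info.country_name ?? info.country));
```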
The application will:
- Look up the geographical location of source and destination IP addresses
- Add country, city, latitude, longitude, and ISP information to the data
- Store this enriched data in InfluxDB
- Visualize the traffic on a world map in Grafana
This allows you to:
- See where your network traffic is coming from and going to
- Identify traffic patterns by country or region
- Detect unusual connections to unexpected locations
The project includes a RESTful API with comprehensive documentation:
- RapiDoc UI: `/api/docs` (interactive API documentation with a modern UI)
- Swagger UI: `/api/docs/swagger` (traditional Swagger interface)
- OpenAPI Spec: `/api/openapi.json` (raw OpenAPI 3.1.0 specification)
- System: Health check and system status
- Flows: Query network traffic flow data
- Threats: View detected threats
- Metrics: Get traffic metrics and statistics
- Import: Trigger data imports
All protected endpoints require an API key that can be set in your .env file:
```
API_KEY=your-secure-key-here
```
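The exact scheme for presenting the key (header name or query parameter) is shown in the API docs at /api/docs; the example below assumes a simple X-API-Key request header, so adjust it to whatever the docs specify:

```bash
# Hypothetical request: the X-API-Key header name and the /api/flows query parameters are assumptions
curl -H "X-API-Key: your-secure-key-here" "http://localhost:3001/api/flows?limit=10"
```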
If the UI changes and the script stops working, you can re-record the automation:
- Run the recording script:

```bash
./record.sh
```

- Perform the steps in the browser to download the CSV
- The generated code will be saved to `generated-script.js`
- Update `src/unifi-downloader.js` with the new selectors
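If record.sh does not fit your environment, Playwright's built-in recorder can usually be invoked directly (this assumes Playwright is installed as a project dependency; check `npx playwright codegen --help` for the flags available in your version):

```bash
# Record interactions against your controller and save the generated script
npx playwright codegen -o generated-script.js https://<ip.of.your.unifi.controller.or.cloudkey>
```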
MIT