A scalable, non-blocking job aggregation engine.
Building a job board sounds simple until you have to scale it. Job Radar isn't just a scraper; it's an architectural reference for how to build high-concurrency Python backends.
I built this to solve the "blocking problem." Most simple scrapers halt the entire API while fetching data. Job Radar uses a distributed producer-consumer architecture:
- FastAPI handles the request instantly.
- Celery & Redis handle the heavy lifting (scraping/normalizing) in the background.
- Next.js renders the results in a snappy, client-side dashboard.
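Here's a minimal sketch of that hand-off, assuming a Celery task wired to the Redis broker (the module and task names are illustrative, not the repo's actual layout):

```python
# Illustrative sketch: a FastAPI route that enqueues work on Celery
# and returns immediately instead of blocking on the scrape.
from celery import Celery
from fastapi import FastAPI, status

app = FastAPI()
celery_app = Celery("job_radar", broker="redis://localhost:6379/0")

@celery_app.task
def ingest_jobs() -> None:
    ...  # scraping/normalizing runs here, off the request path

@app.post("/api/jobs/ingest", status_code=status.HTTP_202_ACCEPTED)
def trigger_ingest() -> dict:
    result = ingest_jobs.delay()  # push onto the Redis queue; don't wait
    return {"task_id": result.id, "status": "queued"}
```

The request thread never touches the network beyond Redis, which is what keeps the API responsive while workers grind through the scraping.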
Core Tech: Python 3.12, FastAPI, PostgreSQL, Celery, Redis, Next.js 14.
This project uses a heavily customized Makefile to abstract away the complexity of Docker networking and volume management.
# Clone & Enter
git clone [https://github.com/OneBuffaloLabs/job-radar.git](https://github.com/OneBuffaloLabs/job-radar.git)
cd job-radar
# Boot the System
# (Starts Docker, builds containers, runs migrations, starts Redis/Celery)
make up
# Verify it's alive
# API should be at http://localhost:8000
make logs
The UI is a separate Next.js app in the `ui/` folder.

```bash
cd ui
npm install
npm run dev
# Open http://localhost:3000
```
Unlike basic CRUD apps, this system is event-driven. Jobs don't appear until you trigger an ingestion event.
- Trigger: You hit the "Ingest" button (or API endpoint).
- Queue: The API returns a `202 Accepted` immediately. It doesn't wait.
- Process: A Celery worker wakes up, fetches data from sources (like Remotive), normalizes the JSON, and upserts it into Postgres.
- View: The frontend polls the read-optimized API endpoints to display new jobs.
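As a rough illustration of the Process step, here's what fetch → normalize → upsert can look like with SQLAlchemy. The Remotive endpoint, field names, and table schema below are assumptions for the sketch, not the actual code in `app/services/ingestion.py`:

```python
# Illustrative ingest pipeline; schema and source fields are assumed.
import httpx
from sqlalchemy import Column, String, create_engine
from sqlalchemy.dialects.postgresql import insert
from sqlalchemy.orm import DeclarativeBase, Session

class Base(DeclarativeBase):
    pass

class Job(Base):
    __tablename__ = "jobs"
    external_id = Column(String, primary_key=True)
    title = Column(String)
    company = Column(String)
    url = Column(String)

def normalize(raw: dict) -> dict:
    # Flatten one source's payload into the shared schema.
    return {
        "external_id": str(raw["id"]),
        "title": raw["title"],
        "company": raw["company_name"],
        "url": raw["url"],
    }

def ingest(database_url: str) -> int:
    resp = httpx.get("https://remotive.com/api/remote-jobs", timeout=30)
    resp.raise_for_status()
    rows = [normalize(j) for j in resp.json()["jobs"]]

    engine = create_engine(database_url)
    with Session(engine) as session:
        for row in rows:
            # Upsert: insert new jobs, refresh ones we've seen before.
            stmt = insert(Job).values(**row).on_conflict_do_update(
                index_elements=["external_id"],
                set_={k: v for k, v in row.items() if k != "external_id"},
            )
            session.execute(stmt)
        session.commit()
    return len(rows)
```

The upsert keyed on an external ID is what makes re-running ingestion idempotent: triggering it twice updates records instead of duplicating them.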
Manual Trigger:

```bash
curl -X POST http://localhost:8000/api/jobs/ingest
```
I wrote a Makefile so I wouldn't have to remember Docker Compose flags.
| Command | What it does |
|---|---|
| `make up` | Starts the entire stack (API, DB, Redis, Worker). |
| `make down` | Stops containers and shuts down Docker Engine to save battery. |
| `make logs` | Tails the API logs. |
| `make worker-logs` | Tails the Celery worker logs (best for debugging scraping). |
| `make shell` | Drops you into a bash shell inside the running container. |
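If you're curious what those targets wrap, a stripped-down skeleton might look like this (a sketch, not the repo's actual Makefile; `web` and `worker` are the assumed Compose service names):

```makefile
up:
	docker compose up -d --build

down:
	docker compose down
	# (the real target also quits Docker Engine, per the table above)

logs:
	docker compose logs -f web

worker-logs:
	docker compose logs -f worker

shell:
	docker compose exec web bash
```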
- `app/core/celery_app.py`: The background worker configuration.
- `app/services/ingestion.py`: The business logic for fetching and cleaning data.
- `ui/app`: Next.js App Router structure.
- `docker-compose.yml`: Orchestrates the 4 services (Web, DB, Redis, Worker).
Built by Andrew Elbaneh @ One Buffalo Labs. Licensed under MIT.