A FastAPI + Celery + Kafka application for receiving, processing, and monitoring webhooks.
- FastAPI endpoint for receiving webhooks and verifying HMAC signatures
- Kafka integration for event streaming and main processing
- Celery worker for handling processing failures and retries, with DLQ support
- SQLAlchemy/Postgres for persistence and audit logging
- Prometheus/Grafana monitoring for performance and health metrics
- Docker & docker-compose
- (Optional, for local dev) Python 3.10+ and `pip`
- Copy the example: `cp .env.example .env`
- Populate `.env` with:

      DATABASE_URL=postgresql://...
      KAFKA_BROKER=kafka:9092
      CELERY_BROKER_URL=amqp://guest:guest@rabbitmq:5672//
      RUN_PROM_METRICS=true|false
      ENV=development|production
      WEBHOOK_SECRET=<your_webhook_secret>
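As a minimal sketch, these variables might be read at startup along the following lines (the `Settings` class, attribute names, and defaults here are illustrative, not the app's actual config loader):

```python
import os

class Settings:
    """Illustrative settings holder built from an environment mapping."""

    def __init__(self, env: dict):
        self.database_url = env.get("DATABASE_URL", "")
        self.kafka_broker = env.get("KAFKA_BROKER", "kafka:9092")
        self.celery_broker_url = env.get("CELERY_BROKER_URL", "")
        # RUN_PROM_METRICS arrives as the string "true" or "false"
        self.run_prom_metrics = env.get("RUN_PROM_METRICS", "false").lower() == "true"
        self.env = env.get("ENV", "development")
        self.webhook_secret = env.get("WEBHOOK_SECRET", "")

# In the app this would typically be built once from os.environ.
settings = Settings(os.environ)
```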
- Seed the database: `python db/seed.py`
- Run with Docker: `docker-compose up --build`
- Or, for local development: `pip install -r requirements.txt`, then `uvicorn app.main:app --reload`
- `POST /webhook` – Send a webhook event. The endpoint verifies the HMAC signature, records the event, and asynchronously enqueues the task via Celery (with sharding across multiple queues).
- `POST /customers` – Create a customer and receive a webhook secret.
- `GET /webhooks` – Retrieve processed webhook events for a customer.
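The HMAC check and queue sharding mentioned above can be sketched roughly as follows (the hex encoding, SHA-256 choice, shard count, and queue naming scheme are assumptions, not necessarily what this app uses):

```python
import hashlib
import hmac

NUM_QUEUES = 4  # assumed shard count

def verify_signature(secret: str, body: bytes, signature: str) -> bool:
    """Constant-time check of a hex HMAC-SHA256 signature over the raw body."""
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(expected, signature)

def shard_queue(customer_id: str) -> str:
    """Deterministically map a customer to one of NUM_QUEUES Celery queues."""
    shard = int(hashlib.sha256(customer_id.encode()).hexdigest(), 16) % NUM_QUEUES
    return f"webhooks.{shard}"
```

A handler would call `verify_signature` on the raw request body before recording the event, and could pass `queue=shard_queue(customer_id)` when enqueuing the task with Celery's `apply_async`.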
- FastAPI: Handles HTTP API requests and webhook validation.
- Celery: Processes webhook tasks asynchronously, with built-in retries and DLQ (Dead Letter Queue) handling.
- Kafka: Streams events downstream and handles failed events via DLQ.
- PostgreSQL: Stores webhook events, audit logs, and customer data.
- Prometheus & Grafana: Monitor application metrics including processing latency, success/failure counts, and overall throughput.
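The retry-then-DLQ flow described above can be sketched independently of Celery (the attempt budget and DLQ record shape here are illustrative; the real app expresses this with Celery retries and a Kafka DLQ topic):

```python
MAX_ATTEMPTS = 3  # assumed retry budget

def process_with_retries(event, handler, dlq):
    """Run handler on event, retrying on failure; dead-letter on exhaustion."""
    last_error = None
    for attempt in range(MAX_ATTEMPTS):
        try:
            return handler(event)
        except Exception as exc:  # in a Celery task this would be self.retry(exc=exc)
            last_error = exc
    # All attempts failed: park the event (and the reason) on the DLQ
    # for later inspection or replay.
    dlq.append({"event": event, "error": repr(last_error)})
    return None
```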
Ensure your .env file (or environment configuration) contains:
- `DATABASE_URL` – PostgreSQL connection string.
- `KAFKA_BROKER` – Kafka broker address.
- `WEBHOOK_SECRET` – Default webhook secret (used for tests).
- `ENV` – Set to `production` in production environments.
- Additional variables as needed for RabbitMQ, Prometheus (`RUN_PROM_METRICS`), and Celery.
- Run any required migrations using your migration tool (e.g., Alembic) or execute the provided seed scripts to populate customer data and test records.
- Prometheus Metrics:
  Metrics are exposed on the `/metrics` endpoint (configured to work with Prometheus multiprocess mode).
  Check Prometheus at http://localhost:9090 and the Grafana dashboards for detailed performance data.
- Troubleshooting:
  - Verify that environment variables match between the backend and the Celery worker.
  - Use `docker-compose logs` to check for errors in Celery, FastAPI, or Kafka.
  - If tasks are retried or sent to the DLQ unexpectedly, review your HMAC verification and database logs.
  - Validate YAML configurations with `docker-compose config`.
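If you need to point a standalone Prometheus at the app, a scrape config along these lines should work (the job name and target are assumptions; match them to your docker-compose service names and port):

```yaml
scrape_configs:
  - job_name: fastapi_app        # assumed job name
    scrape_interval: 15s
    static_configs:
      - targets: ["app:8000"]    # the FastAPI service exposing /metrics
```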
- The tests include scenarios for successful webhook processing and task failure handling (DLQ behavior).
- To run tests: `docker compose run --rm test_runner -s`
- Example test files include: `tests/send_webhook_test.py`, `tests/task_failure_test.py`
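For reference, a signed request for a webhook test can be constructed along these lines (the `X-Webhook-Signature` header name and hex encoding are assumptions about the app's signing scheme):

```python
import hashlib
import hmac
import json

def signed_headers(secret: str, payload: dict) -> tuple[bytes, dict]:
    """Serialize a payload and produce headers carrying its HMAC-SHA256 signature."""
    body = json.dumps(payload, separators=(",", ":")).encode()
    signature = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    return body, {
        "Content-Type": "application/json",
        "X-Webhook-Signature": signature,  # assumed header name
    }
```

A test would then POST `body` with these headers to `/webhook` using its HTTP client of choice.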
- FastAPI Documentation
- Celery Documentation
- Kafka Documentation
- Prometheus Client Python
- Grafana Documentation
Happy monitoring!