This is a scalable backend application for user authentication and event logging, designed with a modular structure for maintainability and growth. Built with Fastify, Kafka, TypeScript, and Zod, it integrates PostgreSQL, DynamoDB, and ClickHouse to handle user management, event streaming, and analytics. It is well suited to modern, event-driven systems that require observability, traceability, and flexibility.
- Backend: Fastify, TypeScript
- Event Streaming: Kafka
- Database: PostgreSQL, ClickHouse, DynamoDB
- Authentication: JWT + Bcrypt
- Kafka Client: kafkajs
- Validation: Zod
- ✅ User registration
- ✅ Login with JWT token generation
- ✅ Event-driven architecture with Kafka
- ✅ Event logs stored in:
  - PostgreSQL for relational data
  - ClickHouse for analytics
  - DynamoDB for fast key-based access
- ✅ TypeScript with Zod validation and password hashing
- ✅ Custom metadata logging: IP address, user agent, service, environment, etc.
- ✅ Environment-based configuration
```
.
├── docker/              # Docker config and inits
├── prisma/              # Prisma models and migrations
├── src/
│   ├── config/          # Environment configs
│   ├── consumers/       # Kafka consumers
│   ├── controllers/     # Route handlers (e.g. auth)
│   ├── middlewares/     # Middlewares (e.g. auth)
│   ├── services/        # Business logic (users, logs, events, consumer)
│   ├── routes/          # Route definitions
│   ├── models/          # Structure definitions and schemas
│   ├── utils/           # Utilities (e.g. JWT, logger)
│   ├── types/           # All Zod schemas
│   └── index.ts         # Main Fastify server
├── .env                 # Environment variables
├── README.md
└── docker-compose.yaml
```
```bash
git clone https://github.com/your-username/fastify-events-stream.git
cd fastify-events-stream
```

Create a `.env` file with the required variables; see `example.env` for reference.

```bash
docker compose up -d
npm install
npm run build
npm run generate-db
npm run create-dDb-table
npm run start
npm run consumer-up
```

Registers a new user and emits a Kafka event.
```json
{
  "username": "john_doe",
  "email": "[email protected]",
  "password": "StrongPass123!"
}
```

Returns a JWT token on successful login.
```json
{
  "username": "john_doe",
  "password": "StrongPass123!"
}
```

Creates custom events (a valid Bearer token is required).
```json
{
  "serviceName": "user_test"
}
```

Returns information about events. All routes are protected by the auth middleware, so a valid Bearer token is required.
- `api/v1/auth/me` - Get information about the current user
- `api/v1/events/:eventId` - Get information about an event
- `api/v1/events/recent?topEvents=5` - Get the top # of events
- `api/v1/analytics/top-events` - Get information about the top 10 events
- `api/v1/analytics/user/:userId` - Get the events by userId
Each action emits a Kafka event with rich metadata:
```json
{
  "id": "uuid",
  "event_type": "analytics_user",
  "user_id": "uuid",
  "action_type": "events",
  "metadata": {
    "ip_address": "192.168.1.1",
    "user_agent": "Mozilla/5.0",
    "resource_type": "auth",
    "timestamp": "2025-04-13T12:34:56Z",
    "result": "success",
    "service_name": "auth-service",
    "environment": "dev",
    "version": "1.0.0"
  }
}
```

- Event-driven logging decouples user actions from analytics or audit systems.
- Typed contracts ensure strong consistency across producers and consumers.
- Zod provides centralized and type-safe input validation.
- JWT-based auth allows for stateless, scalable session handling.
- Kafka consumers are grouped by domain (`auth`, `analytics`, etc.) to ensure independence.
You can use Postman or any HTTP client to test endpoints. Kafka events can be observed via Kafka UI or logs. Logs are persisted in PostgreSQL and ClickHouse.
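If you prefer a scriptable check over Postman, a small smoke test with Node's built-in `fetch` works too. The base URL, the login path, and the `token` field name in the response are assumptions here; adjust them to your routes and `.env`.

```typescript
const BASE = "http://localhost:3000/api/v1"; // assumed host and port

// Build the Authorization header expected by the protected routes.
export function authHeader(token: string): Record<string, string> {
  return { Authorization: `Bearer ${token}` };
}

export async function smokeTest(): Promise<void> {
  // Log in and grab the JWT (response field name is an assumption).
  const login = await fetch(`${BASE}/auth/login`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ username: "john_doe", password: "StrongPass123!" }),
  });
  const { token } = (await login.json()) as { token?: string };
  if (!token) throw new Error("login did not return a token");

  // Hit a protected route with the Bearer token.
  const me = await fetch(`${BASE}/auth/me`, { headers: authHeader(token) });
  console.log("auth/me status:", me.status);
}
```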
- [ ] Add unit/integration tests
- [ ] Add refresh token logic
- [ ] Handle the already-exists error in the consumer
- [ ] Add OpenAPI (Swagger) docs
- [ ] Make the consumer modular
- [ ] Create a Docker setup to run the API
- [ ] Create a Docker setup to run the consumer
Feel free to open issues or submit pull requests to improve the project.