Matchly is a full-stack AI-powered resume intelligence platform that analyzes resumes and job descriptions, generates tailored cover letters, and provides insightful feedback. Built with Next.js, FastAPI, and containerized microservices.
- Frontend: Next.js with shadcn/ui
- Backend: FastAPI
- Vector Database: ChromaDB
- Containerization: Docker & Docker Compose
- Styling: Tailwind CSS, Google Fonts (Geist)
- PDF Parsing: PyMuPDF (via fitz)
- LangChain: Prompt templating & chaining
- Ollama (Mistral): Local LLM backend
```
Matchly/
├── frontend/              # Next.js frontend app
│   ├── app/               # App router and layout logic
│   ├── components/        # UI components (ResumeUploader, JDInput, etc.)
│   ├── lib/
│   │   └── api.ts         # Frontend API interaction with FastAPI
│   └── ui/                # shadcn/ui components
│
├── backend/               # FastAPI backend
│   ├── main.py            # Application entrypoint
│   ├── resume_utils/      # Resume parsing, embedding, matching logic
│   │   ├── parser.py      # PyMuPDF-based PDF text extraction
│   │   ├── embedder.py    # Text embedding for vector comparison
│   │   └── matcher.py     # Matching algorithm between resume and JD
│   └── api/               # FastAPI route handlers
│       └── process.py     # /api/process endpoint
│
├── docker-compose.yml     # Orchestrates containers
└── README.md              # Project documentation
```
- File upload interface for resumes and job descriptions
- Input fields for job title and hiring manager name
- Editable text area for the generated cover letter
- Copy to clipboard and download as PDF functionality for the cover letter
- Handles FormData submission to FastAPI backend
- Responsive UI built with Tailwind CSS and shadcn/ui components
- Resume and JD parsing with PyMuPDF (fitz)
- Embedding and similarity scoring using ChromaDB
- Cover letter generation via LangChain-prompted Mistral LLM (local via Ollama)
- Supports RAG, prompt chaining, tone control, and personalized output
- REST API exposed at /api/process with CORS enabled
- Local LLM Inference: Integrated Mistral via Ollama for fast, private, and offline cover letter generation.
- LangChain Prompt Chaining: Modular prompt templates and chaining logic enable role-specific, tone-adjustable letters.
- RAG Workflow with ChromaDB: Combines resume and job description context via a Retrieval-Augmented Generation (RAG) setup, using ChromaDB for fast vector similarity search.
- Semantic Personalization: Prompts adapt based on parsed resume, job title, and hiring manager details for contextual generation.
- Editable AI Output: LLM-generated cover letters are editable in the UI to allow manual refinement and customization.
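The prompt-chaining idea can be illustrated without LangChain itself. This stdlib sketch mirrors what a LangChain prompt template plus a two-step chain does (the template text and variable names here are hypothetical, and the LLM call is stubbed out):

```python
from string import Template

# Step 1: summarize the candidate against the JD (its output feeds step 2)
SUMMARY_TMPL = Template(
    "Summarize how this resume matches the job description.\n"
    "Resume: $resume\nJob description: $jd"
)

# Step 2: write the letter from the step-1 summary, with tone control
LETTER_TMPL = Template(
    "Write a $tone cover letter for the role of $job_title, "
    "addressed to $hiring_manager, based on this match summary:\n$summary"
)

def build_prompts(resume, jd, job_title, hiring_manager, tone="professional"):
    summary_prompt = SUMMARY_TMPL.substitute(resume=resume, jd=jd)
    # In the real app an LLM call (Mistral via Ollama) produces `summary`;
    # a placeholder stands in here.
    summary = f"(LLM summary of {len(resume)}-char resume vs JD)"
    letter_prompt = LETTER_TMPL.substitute(
        tone=tone,
        job_title=job_title,
        hiring_manager=hiring_manager,
        summary=summary,
    )
    return summary_prompt, letter_prompt

summary_p, letter_p = build_prompts(
    "Python, ML, data analysis...",
    "We need an ML engineer...",
    "ML Engineer",
    "Jane Doe",
)
```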
- Dockerized setup for frontend, backend, and ChromaDB
- Local development using Docker Compose
- Environment supports hot reload for both FastAPI and Next.js
- Easy extension for cloud deployment (Render / Railway / Vercel)
This project uses Docker Compose to run the full stack: Next.js frontend + FastAPI backend + ChromaDB vector store.
```bash
git clone https://github.com/sourabhaprasad/matchly.git
cd matchly
docker-compose up --build
```

This will:
- Serve the Next.js frontend on http://localhost:3000
- Start the FastAPI backend on http://localhost:8001
- Launch the ChromaDB vector database
If you modify code or dependencies:
```bash
docker-compose down
docker-compose up --build
```

The POST /api/process endpoint accepts multipart FormData and returns match results and a generated cover letter.
| Field | Type | Required | Description |
|---|---|---|---|
| resume | File | Yes | Resume in PDF format |
| job_description | String | Yes | Job description text |
| job_title | String | Optional | Job role title |
| hiring_manager | String | Optional | Hiring manager's name for personalization |
| tone | String | Optional | Tone of the cover letter (e.g., "professional", "friendly") |
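A minimal client call might be assembled like this (the input values are hypothetical placeholders; the actual request requires the stack to be running via docker-compose):

```python
import io

# Placeholder PDF bytes; use a real resume file in practice
resume_pdf = io.BytesIO(b"%PDF-1.4 placeholder")

files = {"resume": ("resume.pdf", resume_pdf, "application/pdf")}
data = {
    "job_description": "We are hiring a machine learning engineer...",
    "job_title": "Machine Learning Engineer",
    "hiring_manager": "Jane Doe",
    "tone": "professional",
}

# With the backend running (docker-compose up), post the form:
#   import requests
#   resp = requests.post("http://localhost:8001/api/process",
#                        files=files, data=data)
#   resp.json()  # match score, skills, cover letter
```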
```json
{
  "match_score": 0.84,
  "matched_skills": ["Python", "Machine Learning", "Data Analysis"],
  "unmatched_skills": ["Kubernetes", "AWS Lambda"],
  "cover_letter": "Dear Hiring Manager,\nI am excited to apply for the role of..."
}
```

ChromaDB is used to store and compare semantic embeddings between resumes and job descriptions.
- Persistence: Stores vector data locally at ./chroma_db/
- In-Process: No separate service required; runs with FastAPI
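At its core, the match score reduces to vector similarity over these embeddings. A dependency-free sketch of cosine similarity, shown here purely as an illustration (ChromaDB supports several distance metrics, and the vectors below are hypothetical):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

resume_vec = [0.2, 0.8, 0.1]   # hypothetical resume embedding
jd_vec = [0.25, 0.7, 0.05]     # hypothetical job-description embedding
score = cosine_similarity(resume_vec, jd_vec)  # closer to 1.0 = better match
```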
To reset embeddings:
```bash
rm -rf backend/chroma_db/
```

- Integrate GPT-based or OSS LLMs for:
  - Enhanced resume and JD understanding
  - Dynamic and role-specific cover letter generation
- Improve scoring with custom similarity metrics and weights
- Add user authentication (via Supabase or NextAuth)
- Store upload history, cover letters, and match scores per user
- Production deployment with CI/CD pipelines
matchly-video.demo.mp4
- Install a new shadcn/ui component:

  ```bash
  npx shadcn@latest add <component>
  ```

- Add a frontend dependency without rebuilding the container:

  ```bash
  docker exec -it <frontend-container-id> npm add <package-name>
  ```

- Common issues:
  - Cannot find module 'sonner': run npm add sonner inside the frontend directory
  - CORS errors on the frontend: ensure CORS middleware is enabled in main.py in the FastAPI backend
- Sourabha Prasad – GitHub