Prerequisites
Python 3.9 – 3.12 is required to run Cognee.
Environment Configuration
- We recommend creating a `.env` file in your project root
- Cognee supports many configuration options, and a `.env` file keeps them organized
API Keys & Models
You have two main options for configuring LLM and embedding providers:
Option 1: OpenAI (Simplest)
- Single API key handles both LLM and embeddings
- Uses gpt-4o-mini for LLM and text-embedding-3-small for embeddings by default
- Works out of the box with minimal configuration
Option 2: Other Providers
- Configure both LLM and embedding providers separately
- Supports Gemini, Anthropic, Ollama, and more
- Requires setting both `LLM_*` and `EMBEDDING_*` variables (see the sketch below)
By default, Cognee uses OpenAI for both LLMs and embeddings. If you change the LLM provider but don’t configure embeddings, embeddings will still default to OpenAI.
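As a rough illustration of what option 2 involves, here is a minimal `.env` sketch for a hypothetical non-OpenAI setup. The specific variable names (LLM_PROVIDER, LLM_MODEL, EMBEDDING_PROVIDER, and so on) and the example values are assumptions; check Cognee's configuration reference for the exact names your version supports.

```
# Hypothetical .env for a non-OpenAI provider (variable names and values are assumptions)

# LLM settings
LLM_PROVIDER="gemini"
LLM_MODEL="gemini-1.5-flash"
LLM_API_KEY="your-gemini-api-key"

# Embedding settings: set these too, otherwise embeddings fall back to OpenAI
EMBEDDING_PROVIDER="gemini"
EMBEDDING_MODEL="text-embedding-004"
EMBEDDING_API_KEY="your-gemini-api-key"
```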
Virtual Environment
- We recommend using uv for virtual environment management
- Run the following commands to create and activate a virtual environment:
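The commands themselves were not captured in this extract; a minimal sketch of the usual uv workflow, assuming uv is already installed, looks like this:

```bash
# Create a virtual environment in .venv with a supported Python version
uv venv --python 3.11

# Activate it on macOS/Linux
source .venv/bin/activate

# On Windows (PowerShell), activate with:
# .venv\Scripts\activate
```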
Optional
Database
- A PostgreSQL database is required if you plan to use PostgreSQL as your relational database (requires the `postgres` extra)
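If you do plan to use PostgreSQL, a sketch of that setup might look like the following. The `postgres` extra name comes from this page, but the connection variable names below are assumptions; verify them against Cognee's database configuration docs.

```bash
# Install Cognee with the postgres extra
pip install "cognee[postgres]"
```

Then point Cognee at your server, for example via `.env` (key names assumed):

```
DB_PROVIDER="postgres"
DB_HOST="127.0.0.1"
DB_PORT="5432"
DB_USERNAME="cognee"
DB_PASSWORD="cognee"
DB_NAME="cognee_db"
```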
Setup
- OpenAI (Recommended): covered by the steps below
- Other Providers (Gemini, Anthropic, etc.): configure the `LLM_*` and `EMBEDDING_*` variables described above
Installation: Install Cognee with all extras. What this gives you: Cognee installed with default local databases (SQLite, LanceDB, Kuzu), with no external servers required.
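The exact install command was not preserved in this extract; a minimal sketch, with the extras names treated as assumptions to adjust for your needs, is:

```bash
# Base install from PyPI
pip install cognee

# Add extras as needed, e.g. PostgreSQL support (extra names are assumptions; see the Cognee docs)
# pip install "cognee[postgres]"
```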
Environment: Add your OpenAI API key to your `.env` file. This single API key handles both LLM and embeddings. We use gpt-4o-mini for the LLM model and text-embedding-3-small for embeddings by default.
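A minimal `.env` sketch for this step, assuming LLM_API_KEY is the variable Cognee reads for the OpenAI key:

```
# OpenAI key used for both the LLM and embeddings (variable name assumed; check Cognee's docs)
LLM_API_KEY="your-openai-api-key"
```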