An SDK project for working with different LLM providers through a unified interface.
Clone the repository:
git clone https://github.com/bazhil/llm_factory.git
Create a .env file:
cp .env.example .env
Fill in the variables in .env:
PROVIDER=
DEEPSEEK_API_KEY=
GIGA_CHAT_AUTH_KEY=
OLLAMA_HOST=
OLLAMA_MODEL=
YANDEX_GPT_FOLDER_ID=
YANDEX_GPT_API_KEY=
OPENAI_API_KEY=
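Only the variables for the provider you actually use need values. As a minimal sketch of loading and checking them, assuming the `python-dotenv` package is available (it may or may not be listed in requirements.txt):

```python
import os

from dotenv import load_dotenv  # provided by the python-dotenv package

load_dotenv()  # read .env from the current directory into the process environment

# Names match the .env template above; unset variables come back as None.
env_names = [
    "PROVIDER",
    "DEEPSEEK_API_KEY",
    "GIGA_CHAT_AUTH_KEY",
    "OLLAMA_HOST",
    "OLLAMA_MODEL",
    "YANDEX_GPT_FOLDER_ID",
    "YANDEX_GPT_API_KEY",
    "OPENAI_API_KEY",
]
unset = [name for name in env_names if not os.getenv(name)]
print("Still unset (fill only the ones your provider needs):", unset)
```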
Install requirements:
pip install -r requirements.txt
Where:
PROVIDER - name of the target LLM provider (ollama / deepseek / openai / yandex / gigachat)
DEEPSEEK_API_KEY - DeepSeek API key
GIGA_CHAT_AUTH_KEY - GigaChat authorization key
OLLAMA_HOST - host where Ollama is running
OLLAMA_MODEL - Ollama model to use
YANDEX_GPT_FOLDER_ID - Yandex GPT folder ID
YANDEX_GPT_API_KEY - Yandex GPT API key
OPENAI_API_KEY - OpenAI API key
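To illustrate how PROVIDER can drive the choice of backend, here is a hedged sketch; `build_settings` and the returned dictionaries are hypothetical names for illustration only, not necessarily the interface llm_factory exposes:

```python
import os

from dotenv import load_dotenv

load_dotenv()


def build_settings(provider: str) -> dict:
    """Map PROVIDER to the .env values that provider needs (illustrative only)."""
    if provider == "ollama":
        return {"host": os.getenv("OLLAMA_HOST"), "model": os.getenv("OLLAMA_MODEL")}
    if provider == "deepseek":
        return {"api_key": os.getenv("DEEPSEEK_API_KEY")}
    if provider == "openai":
        return {"api_key": os.getenv("OPENAI_API_KEY")}
    if provider == "yandex":
        return {
            "folder_id": os.getenv("YANDEX_GPT_FOLDER_ID"),
            "api_key": os.getenv("YANDEX_GPT_API_KEY"),
        }
    if provider == "gigachat":
        return {"auth_key": os.getenv("GIGA_CHAT_AUTH_KEY")}
    raise ValueError(f"Unknown PROVIDER: {provider!r}")


settings = build_settings(os.getenv("PROVIDER", "ollama"))
print(settings)
```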
We welcome contributions!
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Thanks to all contributors
- Inspired by the need for a reusable, unified LLM interface