
Tags: idris/litellm

v1.42.3

fix(vertex_ai_llama3.py): Fix llama3 streaming issue

Closes BerriAI#4885

v1.42.3-stable

docs(custom_llm_server.md): cleanup docs

v1.42.2

bump: version 1.42.1 → 1.42.2

v1.42.2-stable

bump: version 1.42.1 → 1.42.2

v1.42.1

bump: version 1.42.0 → 1.42.1

v1.42.0

Merge pull request BerriAI#4856 from msabramo/fix-test_prompt_factory-flake8-warning

Fix `test_prompt_factory` flake8 warning

v1.42.0-stable

build(docker-compose.yml): add prometheus scraper to docker compose

persists prometheus data across restarts

v1.41.28

bump: version 1.41.27 → 1.41.28

v1.41.27

fix(proxy/utils.py): add stronger typing for litellm params in failure call logging

v1.41.26

Merge pull request BerriAI#4819 from BerriAI/revert-4613-main

Revert "Fix: use Bedrock region from environment variables before other region definitions"