(fix) Pass through spend tracking - ensure `custom_llm_provider` is tracked for Vertex, Google AI Studio, Anthropic (BerriAI#8882)

* fix track custom llm provider on pass through routes
* fix use correct provider for google ai studio
* fix tracking custom llm provider on pass through route
* ui fix get provider logo
* update tests to track custom llm provider
* test_anthropic_streaming_with_headers
* Potential fix for code scanning alert no. 2263: Incomplete URL substring sanitization

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
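The final bullet in the entry above refers to a common CodeQL finding: deciding which provider a pass-through URL belongs to by substring matching (e.g. `"api.anthropic.com" in url`) can be fooled by a lookalike domain such as `api.anthropic.com.evil.example`. A minimal sketch of the safer hostname-based check follows; the helper name, the provider map, and the hostnames are illustrative assumptions, not litellm's actual code.

```python
from urllib.parse import urlparse

# Illustrative hostname -> provider map; the real litellm mapping is an assumption here.
_PROVIDER_HOSTS = {
    "generativelanguage.googleapis.com": "gemini",      # Google AI Studio
    "aiplatform.googleapis.com": "vertex_ai",
    "api.anthropic.com": "anthropic",
}

def infer_custom_llm_provider(url: str) -> str | None:
    """Return the provider for a pass-through URL, matching on the parsed
    hostname rather than on a substring of the raw URL string."""
    host = (urlparse(url).hostname or "").lower()
    for known_host, provider in _PROVIDER_HOSTS.items():
        # Exact host, or a legitimate subdomain of a known host.
        if host == known_host or host.endswith("." + known_host):
            return provider
    return None

# A lookalike domain no longer matches once the hostname is parsed out.
assert infer_custom_llm_provider("https://api.anthropic.com/v1/messages") == "anthropic"
assert infer_custom_llm_provider("https://api.anthropic.com.evil.example/v1/messages") is None
```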
fix(sagemaker/completion/handler.py): fix typo Fixes BerriAI#8863
fix(proxy/_types.py): fixes issue where an internal user was able to escalate their role with a UI key (BerriAI#8740)

* fix(proxy/_types.py): fixes issue where internal user able to escalate their role with ui key (Fixes BerriAI#8029)
* style: cleanup
* test: handle bedrock instability
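The fix above is essentially request-model validation: an update payload coming from a non-admin key must not be allowed to assign a role more privileged than the caller's own. The sketch below shows that idea with a Pydantic model and a separate enforcement check; the model name, field names, and role values are assumptions for illustration and do not mirror litellm's actual `proxy/_types.py`.

```python
from pydantic import BaseModel, field_validator

# Roles ordered from least to most privileged; illustrative values only.
_ROLE_RANK = {"internal_user_viewer": 0, "internal_user": 1, "proxy_admin": 2}

class UpdateUserRequest(BaseModel):
    """Hypothetical update payload; litellm's real type differs."""
    user_id: str
    user_role: str | None = None

    @field_validator("user_role")
    @classmethod
    def role_must_be_known(cls, v):
        # Reject role strings that are not part of the known set.
        if v is not None and v not in _ROLE_RANK:
            raise ValueError(f"unknown role: {v}")
        return v

def enforce_no_self_escalation(request: UpdateUserRequest, caller_role: str) -> None:
    """Reject updates that would grant a role above the caller's own."""
    if request.user_role is None:
        return
    if _ROLE_RANK[request.user_role] > _ROLE_RANK.get(caller_role, 0):
        raise PermissionError(
            f"caller with role {caller_role!r} cannot assign role {request.user_role!r}"
        )
```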
(Infra/DB) - Allow running older litellm version when out of sync with current state of DB (BerriAI#8695)

* fix check migration
* clean up should_update_prisma_schema
* update test
* db_migration_disable_update_check
* Check container logs for expected message
* db_migration_disable_update_check
* test_check_migration_out_of_sync
* test_should_update_prisma_schema
* db_migration_disable_update_check
* pip install aiohttp
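The schema-update guard described in this entry boils down to a small decision function: push Prisma schema changes only when doing so has not been explicitly disabled, so an older litellm image can still start against a database whose schema is ahead of the code. The sketch below illustrates that shape; the environment-variable name and the function signature are assumptions, not litellm's exact implementation.

```python
import os

def should_update_prisma_schema(disable_flag_env: str = "DISABLE_SCHEMA_UPDATE") -> bool:
    """Decide whether this process should push Prisma schema changes.

    Returning False lets an older litellm version boot against a database
    whose schema is newer than the code expects, instead of failing at
    startup. The env var name here is an assumption for illustration.
    """
    disabled = os.getenv(disable_flag_env, "false").strip().lower() in ("1", "true", "yes")
    return not disabled

# Example: with the disable flag set, the container skips the schema push
# and only verifies it can connect to the existing schema.
if should_update_prisma_schema():
    print("would run: prisma db push")
else:
    print("skipping schema update; running against existing DB schema")
```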