Tags: daureg/litellm

v1.67.3.dev1

bump: version 1.67.2 → 1.67.3

v1.67.2-nightly

test: update test to not use gemini-pro

google removed it

v1.67.1-nightly


Verified

This commit was created on GitHub.com and signed with GitHub’s verified signature.
Gemini-2.5-flash improvements (BerriAI#10198)

* fix(vertex_and_google_ai_studio_gemini.py): allow thinking budget = 0

Fixes BerriAI#10121

* fix(vertex_and_google_ai_studio_gemini.py): handle nuance in counting exclusive vs. inclusive tokens

Addresses BerriAI#10141 (comment)
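The exclusive-vs-inclusive distinction above can be sketched as a small helper: some providers report a completion-token count that already contains the reasoning tokens, while others report reasoning tokens on top of it. This is an illustrative sketch, not litellm's actual code; the function name and parameters are assumptions.

```python
def normalize_completion_tokens(reported_completion: int,
                                reasoning_tokens: int,
                                reasoning_is_inclusive: bool) -> int:
    """Hypothetical helper: return the true completion-token total.

    reasoning_is_inclusive=True  -> the provider's reported count already
                                    includes the reasoning tokens.
    reasoning_is_inclusive=False -> reasoning tokens are reported/charged
                                    on top of the reported count.
    """
    if reasoning_is_inclusive:
        return reported_completion
    return reported_completion + reasoning_tokens
```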

v1.67.0-stable


test(utils.py): handle scenario where text tokens + reasoning tokens … (BerriAI#10165)

* test(utils.py): handle scenario where text tokens + reasoning tokens set, but reasoning tokens not charged separately

Addresses BerriAI#10141 (comment)

* fix(vertex_and_google_ai_studio.py): only set content if non-empty str
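The "only set content if non-empty str" fix can be illustrated with a minimal sketch; the helper name and message shape here are assumptions, not litellm's actual code:

```python
def build_assistant_message(text) -> dict:
    """Hypothetical sketch: omit the `content` key entirely unless the
    model returned a non-empty string, so None or empty-string content
    is never set on the message."""
    message = {"role": "assistant"}
    if isinstance(text, str) and text:
        message["content"] = text
    return message
```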

v1.67.0-nightly

bump: version 1.66.3 → 1.67.0

v1.66.3.dev5


[Feat] Support for all litellm providers on Responses API (works with Codex) - Anthropic, Bedrock API, VertexAI, Ollama (BerriAI#10132)

* transform request

* basic handler for LiteLLMCompletionTransformationHandler

* complete transform litellm to responses api

* fixes to test

* fix stream=True

* fix streaming iterator

* fixes for transformation

* fixes for anthropic codex support

* fix pass response_api_optional_params

* test anthropic responses api tools

* update responses types

* working codex with litellm

* add session handler

* fixes streaming iterator

* fix handler

* add litellm codex example

* fix code quality

* test fix

* docs litellm codex

* litellm codexdoc

* docs openai codex with litellm

* docs litellm openai codex

* litellm codex

* linting fixes for transforming responses API

* fix import error

* fix responses api test

* add sync iterator support for responses api
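The request transformation the handler performs can be approximated as follows. This is a simplified sketch under assumptions about the request shapes; the real `LiteLLMCompletionTransformationHandler` covers much more (tools, streaming, session state).

```python
def responses_input_to_messages(input_value, instructions=None) -> list:
    """Hypothetical sketch: map a Responses-API `input` (a plain string or
    a list of {role, content} items) plus optional `instructions` onto the
    chat-completions `messages` format used by litellm providers."""
    messages = []
    if instructions:
        messages.append({"role": "system", "content": instructions})
    if isinstance(input_value, str):
        messages.append({"role": "user", "content": input_value})
    else:
        for item in input_value:
            messages.append({
                "role": item.get("role", "user"),
                "content": item.get("content", ""),
            })
    return messages
```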

v1.66.3.dev1

temp demo

v1.66.3-nightly

bump: version 1.66.2 → 1.66.3

v1.66.2.dev1

bump: version 1.66.2 → 1.66.3

v1.66.2-nightly


fix(llm_http_handler.py): fix fake streaming (BerriAI#10061)

* fix(llm_http_handler.py): fix fake streaming

allows groq to work with llm_http_handler

* fix(groq.py): migrate groq to openai like config

ensures json mode handling works correctly
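"Fake streaming" here refers to slicing an already-complete, non-streaming provider response into chunks so a caller that passed `stream=True` still receives an iterator. A minimal sketch of the idea, with assumed names and chunking, not litellm's actual implementation:

```python
def fake_stream(full_text: str, chunk_size: int = 8):
    """Hypothetical sketch: yield successive slices of a complete response,
    imitating an incremental token stream."""
    for i in range(0, len(full_text), chunk_size):
        yield full_text[i:i + chunk_size]
```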