
Tags: sofatutor/litellm


v1.80.0-stable.1-sofatutor

fix(responses-api): support instructions as list when using prompt objects

When using the Responses API with a prompt object, OpenAI returns
the instructions field as a list of message objects (expanded from
the prompt template) rather than a string.

The OpenAI SDK correctly defines this as:
  instructions: Union[str, List[ResponseInputItem], None]

But LiteLLM's ResponsesAPIResponse had:
  instructions: Optional[str]

This caused a Pydantic ValidationError when streaming responses
tried to parse ResponseCreatedEvent because it expected a string
but received a list.

This fix updates the type to accept both formats:
  instructions: Optional[Union[str, List[Dict[str, Any]]]]

Added tests for:
- Non-streaming responses with instructions as list
- Non-streaming responses with instructions as string
- Streaming events (ResponseCreatedEvent, ResponseInProgressEvent,
  ResponseCompletedEvent) with instructions as list
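The type change described above can be sketched with a minimal Pydantic model. This is an illustrative stand-in, not LiteLLM's actual `ResponsesAPIResponse` class; the field and example messages are assumptions for demonstration.

```python
# Minimal sketch of the widened field type, assuming a simplified stand-in
# for LiteLLM's ResponsesAPIResponse (other fields omitted for brevity).
from typing import Any, Dict, List, Optional, Union

from pydantic import BaseModel


class ResponsesAPIResponse(BaseModel):
    # Before the fix this was Optional[str], so a list of message objects
    # (as returned when using a prompt object) raised a ValidationError.
    instructions: Optional[Union[str, List[Dict[str, Any]]]] = None


# A plain string still validates, as before.
as_string = ResponsesAPIResponse(instructions="You are a helpful assistant.")

# A prompt-object expansion (list of message dicts) now validates too.
as_list = ResponsesAPIResponse(
    instructions=[{"role": "system", "content": "You are a helpful assistant."}]
)

print(type(as_string.instructions).__name__)  # str
print(type(as_list.instructions).__name__)    # list
```

With the narrower `Optional[str]` annotation, the second construction would raise `pydantic.ValidationError`, which is what streaming event parsing hit.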

v1.80.0-stable.1

Non-root docker build fix

v1.80.5.rc.2

ui new build

v1.80.5.rc.1

ui new build

v1.80.5-nightly

fix flaky ui unit tests

v1.78.5-stable-patch-1

fix: prevent memory blowout in LoggingWorker

Previously, tasks were executed sequentially, awaiting each task before
starting the next. With large queues (10k+ tasks), this caused objects to
accumulate in memory, holding references to heavy resources and leading
to memory blowouts.

This update:

1. Introduces a semaphore to allow a configurable number of concurrent tasks,
   improving throughput and preventing queue buildup.
2. Implements a configurable cleaning mechanism for when the queue reaches its
   limit, ensuring tasks are not dropped.
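The two changes above can be sketched with `asyncio`. This is a hypothetical, simplified worker; class name, limits, and the draining policy are illustrative assumptions, not LiteLLM's actual `LoggingWorker` implementation.

```python
# Hedged sketch of the pattern: a semaphore bounds concurrency, and a
# "cleaning" step runs (rather than drops) queued tasks when the queue
# hits its limit. All names and numbers here are illustrative.
import asyncio


class LoggingWorker:
    def __init__(self, max_concurrency: int = 10, max_queue_size: int = 100):
        self.queue: asyncio.Queue = asyncio.Queue(maxsize=max_queue_size)
        self.semaphore = asyncio.Semaphore(max_concurrency)

    async def enqueue(self, coro) -> None:
        # Cleaning mechanism: when the queue is full, flush a batch of
        # old tasks instead of dropping the new one.
        if self.queue.full():
            await self._clean()
        await self.queue.put(coro)

    async def _clean(self) -> None:
        # Execute half of the queued tasks to free space; nothing is lost.
        batch = [self.queue.get_nowait() for _ in range(self.queue.qsize() // 2 or 1)]
        await asyncio.gather(*(self._run(c) for c in batch))

    async def _run(self, coro) -> None:
        # The semaphore caps how many tasks run at once, so references to
        # heavy objects are released steadily instead of piling up.
        async with self.semaphore:
            await coro

    async def drain(self) -> None:
        tasks = []
        while not self.queue.empty():
            tasks.append(self._run(self.queue.get_nowait()))
        await asyncio.gather(*tasks)


async def main() -> int:
    worker = LoggingWorker(max_concurrency=5, max_queue_size=50)
    done = 0

    async def log_task():
        nonlocal done
        await asyncio.sleep(0)
        done += 1

    # Enqueue more tasks than the queue can hold; none are dropped.
    for _ in range(100):
        await worker.enqueue(log_task())
    await worker.drain()
    return done


print(asyncio.run(main()))  # 100
```

Compared to awaiting each task sequentially, the semaphore lets several tasks complete in parallel while still bounding how many are in flight at once.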

v1.80.0.dev6

bump openai 2.8.0

v1.80.0.dev2

[Fix] AI Gateway Auth - Ensure Team Tags works when using JWT Auth (BerriAI#16797)

* fix setting team_metadata

* test_team_metadata_with_tags_flows_through_jwt_auth

v1.79.3-stable.gemini3

Add Day 0 gemini-3-pro-preview support (BerriAI#16719)

* Add thinking signature support for gemini

* Add docs related to thinking signature

* remove double base64 import

* fix mypy errors

* fix litellm/llms/vertex_ai/gemini/vertex_and_google_ai_studio_gemini.py mypy

* Add new gemini 3 model and features

* Add docs related to gemini 3

* Update gemini 3 pricing

* fix llm translation tests

* fix mapped tests

v1.77.3-stable-patch-2

fix: add None check for litellm_params

We currently handle the first case, where the `litellm_params` key may be missing entirely, but not the second, where the key is present but its value is None
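The distinction can be sketched as follows. The function and key names here are illustrative assumptions, not LiteLLM's exact call sites.

```python
# Hedged sketch of the guard: `dict.get` covers a missing key, but an
# explicit None check is also needed when the key exists with value None.
# `deployment` and the key names are hypothetical examples.
def get_litellm_params(deployment: dict) -> dict:
    litellm_params = deployment.get("litellm_params")
    # Covers both cases: key absent, or key present but explicitly None.
    if litellm_params is None:
        return {}
    return litellm_params


print(get_litellm_params({}))                                   # {} (key missing)
print(get_litellm_params({"litellm_params": None}))             # {} (key is None)
print(get_litellm_params({"litellm_params": {"model": "gpt-4"}}))  # {'model': 'gpt-4'}
```

Without the `is None` check, downstream code that did `deployment.get("litellm_params", {})` would still receive None when the key is present with a None value.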