Not sure why getting this response #1017

@shivsant

Description

I am getting this error when transitioning from root_agent to a sub-agent, at the point where the sub-agent makes a function call.

{"error": "litellm.BadRequestError: OpenAIException - Error code: 400 - {'StatusCode': 400, 'Message': 'Gaas:Invalid Request', 'CorrelationId': ''}"}

I need some pointers on how to debug this.
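One way to start debugging: turn on LiteLLM's verbose logging so the raw request payload sent to the provider is printed, which should show exactly what the gateway's "Gaas:Invalid Request" is objecting to. A minimal sketch, assuming a recent LiteLLM version that honors the LITELLM_LOG environment variable:

```python
import os

# Must be set before the first completion call is made.
# Prints the full request/response exchanged with the provider.
os.environ["LITELLM_LOG"] = "DEBUG"

# Older LiteLLM versions use a module flag instead:
# import litellm
# litellm.set_verbose = True
```

With logging on, compare the failing request (the one issued right after the agent transfer) against a request that succeeds, paying attention to the messages and tools fields.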

I am using an OpenAI LLM through LiteLLM.
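In case it helps narrow things down: OpenAI-compatible gateways commonly return a generic 400 when the message history around a function call is malformed, e.g. an assistant message declares tool_calls but the matching role-"tool" response message is missing, which can happen when history is carried across an agent transfer. A dependency-free sketch (the function name and the sample history are mine, for illustration) that checks the history LiteLLM is about to send:

```python
def find_orphan_tool_calls(messages):
    """Return tool_call ids that have no matching role-'tool' response.

    OpenAI-compatible endpoints typically reject a history in which an
    assistant message declares tool_calls but the follow-up 'tool'
    messages answering them are missing.
    """
    declared = set()
    answered = set()
    for msg in messages:
        if msg.get("role") == "assistant":
            for call in msg.get("tool_calls") or []:
                declared.add(call["id"])
        elif msg.get("role") == "tool":
            answered.add(msg.get("tool_call_id"))
    return declared - answered

# Example of a broken history after an agent transfer:
history = [
    {"role": "user", "content": "hi"},
    {"role": "assistant", "tool_calls": [
        {"id": "call_1", "type": "function",
         "function": {"name": "transfer_to_agent", "arguments": "{}"}},
    ]},
    # missing: {"role": "tool", "tool_call_id": "call_1", "content": "..."}
]
print(find_orphan_tool_calls(history))  # -> {'call_1'}
```

If the debug log shows an orphaned tool call like this in the outgoing request, the fix is on the history-construction side (making sure every tool call gets its tool response before the next model call), not in LiteLLM itself.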

Metadata

Labels

models [Component] Issues related to model support
