unexpected xAi reasoning model output #1812
Closed
Dreaming-Codes
started this conversation in
Bug Reports
Replies: 1 comment 1 reply
-
Hi @Dreaming-Codes - I was able to reproduce this issue. It seems that xAI is (incorrectly) always returning JSON when you send a response_format value in the request. We'll stop sending this default value by default. In the meantime (until we fix + release), please set response_format to null for the affected variant via extra_body. For example, assuming you had a function generate_haiku:
curl -X POST "http://localhost:3000/inference" \
-H "Content-Type: application/json" \
-d '{
"function_name": "generate_haiku",
"input": {
"messages": [
{
"role": "user",
"content": "Write a haiku about artificial intelligence."
}
]
},
"extra_body": [{"variant_name": "grok_3_mini_beta", "pointer": "/response_format", "value": null }]
}'
This works as you'd expect (no JSON). Thanks for reporting!
-
As shown in the image, xAI reasoning models like grok-3-mini are generating output with a "response" key, where the text is the value. This is problematic because it differs from other providers' formats, causing my application to malfunction when requests are routed to xAI. I'm uncertain where this particular request format originates, as direct requests to xAI's OpenAI-compatible endpoint do not exhibit this behavior.
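For illustration, here is a minimal client-side sketch of the symptom: the model's text comes back wrapped as JSON like {"response": "..."} instead of plain text. The helper name below is hypothetical (not part of any library) and simply unwraps that observed shape defensively while passing other outputs through unchanged:

```python
import json

def normalize_output(raw: str) -> str:
    """Unwrap the {"response": "..."} JSON wrapper observed from xAI
    reasoning models; return anything else untouched. (Hypothetical helper.)"""
    try:
        parsed = json.loads(raw)
    except json.JSONDecodeError:
        return raw  # already plain text
    # Only unwrap the exact single-key {"response": ...} shape
    if isinstance(parsed, dict) and set(parsed) == {"response"}:
        return parsed["response"]
    return raw

print(normalize_output('{"response": "Silicon minds wake"}'))  # Silicon minds wake
print(normalize_output("Plain haiku text"))                    # Plain haiku text
```

This is only a stopgap for routing code; the real fix is to stop sending the default response_format, as described in the reply above.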