
@Zentorno (Contributor) commented Sep 16, 2025

Problem:

An error occurs when using langfuse-python with OpenRouter as the base_url and deepseek/deepseek-r1-0528 as the model in a function calling scenario.

Error:
A TypeError is raised when a streamed arguments fragment (a str) is concatenated to None (see Root Cause below).

Root Cause:

  1. At Time A, the first chunk received from the streaming API contains tool_calls.function.arguments with a value of None.
    Chunk: {'id': 'gen-1757583928-Bk8WIfqQDQprOXBvlnP4', 'choices': [Choice(delta=ChoiceDelta(content=None, function_call=None, refusal=None, role='assistant', tool_calls=[ChoiceDeltaToolCall(index=0, id='chatcmpl-tool-005b7a71760046f3aa3205fc354342f9', function=ChoiceDeltaToolCallFunction(arguments=None, name='builtin_search_web'), type='function')]), finish_reason=None, index=0, logprobs=None, native_finish_reason=None)], 'created': 1757583931, 'model': 'deepseek/deepseek-r1-0528', 'object': 'chat.completion.chunk', 'service_tier': None, 'system_fingerprint': '', 'usage': None}

  2. The _extract_streamed_openai_response function processes this chunk, and the curr variable is set to:
    curr: [{'name': 'builtin_search_web', 'arguments': None}]

  3. At Time B, a subsequent chunk is received.
    Chunk: {'id': 'gen-1757583928-Bk8WIfqQDQprOXBvlnP4', 'choices': [Choice(delta=ChoiceDelta(content=None, function_call=None, refusal=None, role='assistant', tool_calls=[ChoiceDeltaToolCall(index=0, id=None, function=ChoiceDeltaToolCallFunction(arguments='{"', name=None), type='function')]), finish_reason=None, index=0, logprobs=None, native_finish_reason=None)], 'created': 1757583931, 'model': 'deepseek/deepseek-r1-0528', 'object': 'chat.completion.chunk', 'service_tier': None, 'system_fingerprint': '', 'usage': None}

  4. The _extract_streamed_openai_response function attempts to append the new arguments fragment to curr[-1]["arguments"]. Since curr[-1]["arguments"] is None, a TypeError is raised when the string ('{"') is concatenated to None, as in the minimal sketch below.
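
For illustration, a minimal sketch of the failure (values taken from the chunks above; the real accumulation loop in _extract_streamed_openai_response is simplified here):

curr = [{"name": "builtin_search_web", "arguments": None}]  # state after the Time A chunk

try:
    curr[-1]["arguments"] += '{"'  # Time B fragment; a str cannot be added to None
except TypeError as exc:
    print(exc)  # unsupported operand type(s) for +=: 'NoneType' and 'str'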

Fix:
Before performing the string concatenation, check if the current value of curr[-1]["arguments"] is None. If it is, initialize it as an empty string ("") to prevent the error.
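
A minimal sketch of that guard applied to the simplified example (illustrative only; the actual change is quoted in the Greptile Summary below):

curr = [{"name": "builtin_search_web", "arguments": None}]  # first chunk stored None

fragment = '{"'  # next streamed fragment
if curr[-1]["arguments"] is None:  # normalize the provider's None to ""
    curr[-1]["arguments"] = ""
curr[-1]["arguments"] += fragment  # plain str concatenation from here on

print(curr[-1]["arguments"])  # prints: {"

The accumulated string ends up the same as if the provider had sent "" in the first chunk.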


Important

Fixes type error in _extract_streamed_openai_response by initializing arguments as an empty string if None.

  • Behavior:
    • Fixes type error in _extract_streamed_openai_response in openai.py when tool_calls.function.arguments is None.
    • Initializes curr[-1]["arguments"] as an empty string if None before concatenation.
  • Misc:
    • No changes to other files or functions.

This description was created by Ellipsis for a38f72b.

Disclaimer: Experimental PR review

Greptile Summary

Updated On: 2025-09-16 03:03:22 UTC

This PR fixes a bug in the OpenAI integration's streaming tool calls functionality. The issue occurs when using OpenAI-compatible providers like OpenRouter with certain models (specifically deepseek/deepseek-r1-0528) that return None for tool call arguments in initial streaming chunks, rather than empty strings.

The fix is implemented in langfuse/openai.py within the _extract_streamed_openai_response function at lines 645-650. The solution adds a null check before attempting string concatenation:

if curr[-1]["arguments"] is None:
    curr[-1]["arguments"] = ""

curr[-1]["arguments"] += getattr(
    tool_call_chunk, "arguments", None
)

This change ensures compatibility across different OpenAI-compatible providers by normalizing None values to empty strings before concatenation. The streaming tool calls functionality processes responses incrementally, building up the complete arguments string from multiple chunks. When the first chunk contains None instead of an empty string, the existing code would fail with a TypeError when trying to concatenate subsequent string chunks to None.

The fix maintains the same final result while preventing runtime errors through defensive programming. It specifically addresses the edge case where different providers handle initial streaming responses differently, without changing the core functionality of tool call argument processing.

Confidence score: 4/5

  • This PR is safe to merge with low risk as it's a targeted bug fix for a specific edge case
  • Score reflects simple defensive programming change that handles provider differences without affecting core logic
  • The fix only touches the OpenAI streaming functionality and includes proper null checking

@CLAassistant commented Sep 16, 2025

CLA assistant check
All committers have signed the CLA.

@greptile-apps (bot, Contributor) left a comment

1 file reviewed, no comments

if curr[-1]["arguments"] is None:
curr[-1]["arguments"] = ""

curr[-1]["arguments"] += getattr(

Good fix: initializing curr[-1]['arguments'] as empty string avoids TypeError. However, consider also defaulting the getattr call (e.g. using '' instead of None) so that if tool_call_chunk.arguments is None, concatenation won’t fail.
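
A sketch of that suggestion (illustrative only, not what was merged), normalizing both sides of the concatenation; SimpleNamespace stands in for the real chunk object:

from types import SimpleNamespace

curr = [{"name": "builtin_search_web", "arguments": None}]
tool_call_chunk = SimpleNamespace(arguments=None)  # provider sends None again

if curr[-1]["arguments"] is None:
    curr[-1]["arguments"] = ""
curr[-1]["arguments"] += getattr(tool_call_chunk, "arguments", "") or ""

print(repr(curr[-1]["arguments"]))  # prints: ''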

@hassiebp (Contributor)

Thanks for your contribution, @Zentorno!

@hassiebp merged commit f153b59 into langfuse:main on Sep 16, 2025
1 check passed