
πŸ› Bug: inconsistent behavior when interrupting message streams between agent mode (using LangGraph Python) and direct LLM modeΒ #2711

@suppergmax

Description

♻️ Reproduction Steps

I've noticed inconsistent behavior when interrupting message streams between agent mode (using LangGraph Python) and direct LLM mode:

  • Agent Mode (LangGraph Python):
    When I interrupt a streaming response and immediately send a new message, there is no response.
    After waiting a while and sending another message, both the previously cancelled reply and the new reply appear together.
    The interrupted message does not appear to be properly cancelled.
    This can be reproduced with the Research Canvas demo: https://www.copilotkit.ai/examples/canvas-research

  • Direct LLM Mode:
    When I interrupt a streaming response and immediately send a new message, it responds immediately to the latest message.
    This works as expected.
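For reference, here is a minimal sketch (plain asyncio, not CopilotKit or LangGraph API) of the cancellation semantics I would expect in agent mode: interrupting an in-flight stream discards the old reply entirely, and the next message is answered on its own. The function names (`stream_reply`, `consume`) are hypothetical and only model the behavior.

```python
import asyncio

async def stream_reply(text: str, delay: float = 0.01):
    """Simulate a token-by-token streaming LLM reply (hypothetical)."""
    for token in text.split():
        await asyncio.sleep(delay)
        yield token

async def consume(stream, out: list):
    # Append tokens as they arrive, like rendering a streamed reply.
    async for token in stream:
        out.append(token)

async def main():
    replies = []

    # Start streaming the first reply, then interrupt it mid-stream.
    first = asyncio.create_task(
        consume(stream_reply("old reply that gets cancelled"), replies)
    )
    await asyncio.sleep(0.015)   # let a token or two arrive
    first.cancel()               # user presses "stop"
    try:
        await first
    except asyncio.CancelledError:
        pass

    # Immediately send a new message: only the new reply should complete.
    replies.clear()
    await consume(stream_reply("new reply"), replies)
    return replies

print(asyncio.run(main()))
```

In agent mode, the observed behavior instead resembles the first task never being cancelled, so its tokens surface later alongside the new reply.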

βœ… Expected Behavior

How can I make agent mode behave consistently with direct LLM mode when handling message interruptions? Is there a configuration or implementation pattern I should follow to ensure interrupted agent requests are properly cancelled and new messages are processed immediately?

❌ Actual Behavior

Reproduced on the Research Canvas demo: https://www.copilotkit.ai/examples/canvas-research

Screen recording: 20251112-192044.mp4

𝌚 CopilotKit Version

Demo used: https://www.copilotkit.ai/examples/canvas-research

pnpm list --depth=0 | findstr /i "@copilotkit"
@copilotkit/runtime 1.10.2
@copilotkit/runtime-client-gql 1.10.2
@copilotkit/shared 1.10.2

πŸ“„ Logs (Optional)

Labels: bug (Something isn't working)