I've noticed inconsistent behavior between agent mode (LangGraph Python) and direct-to-LLM mode when interrupting message streams:
Agent Mode (LangGraph Python):
- When I interrupt a streaming response and immediately send a new message, there is no response.
- After waiting a while and sending another message, both the previously interrupted reply and the new reply appear together.
- The interrupted request doesn't seem to be properly cancelled.
This can be reproduced using the Research Canvas demo: https://www.copilotkit.ai/examples/canvas-research
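
For context, here is roughly the shape of the agent-side streaming loop that gets interrupted. This is a minimal, self-contained sketch with a stub graph, not the actual Research Canvas code; `State`, `agent_node`, and `stream_reply` are placeholder names:

```python
import asyncio
from typing import Annotated, TypedDict

from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    messages: Annotated[list, add_messages]


def agent_node(state: State) -> dict:
    # Stand-in for the real agent step; the actual demo calls an LLM
    # here and streams tokens back.
    return {"messages": [("assistant", "stub reply")]}


builder = StateGraph(State)
builder.add_node("agent", agent_node)
builder.add_edge(START, "agent")
builder.add_edge("agent", END)
graph = builder.compile()


async def stream_reply(user_message: str) -> None:
    # This loop is what gets cut off when the user interrupts
    # mid-stream.
    async for event in graph.astream(
        {"messages": [("user", user_message)]},
        stream_mode="values",
    ):
        print(event)


if __name__ == "__main__":
    asyncio.run(stream_reply("hello"))
```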
Direct-to-LLM Mode:
- When I interrupt a streaming response and immediately send a new message, it immediately responds to the latest message.
- Works as expected.
Expected Behavior
How can I make agent mode behave consistently with direct LLM mode when handling message interruptions? Is there a configuration or implementation pattern I should follow to ensure interrupted agent requests are properly cancelled and new messages are processed immediately?
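
What I expected agent mode to do, and what direct-to-LLM mode appears to do, is plain task cancellation: drop the in-flight stream before starting the new one. Here is a minimal sketch of that pattern, reusing the hypothetical `stream_reply` above; `ChatSession` is a placeholder, not a CopilotKit API:

```python
import asyncio


class ChatSession:
    """Hypothetical wrapper: track the in-flight reply so a new
    message cancels it instead of letting both replies run."""

    def __init__(self) -> None:
        self._task: asyncio.Task | None = None

    async def send(self, user_message: str) -> None:
        # Cancel any reply that is still streaming before starting
        # the next one, so the stale reply can't surface later.
        if self._task is not None and not self._task.done():
            self._task.cancel()
            try:
                await self._task  # let the cancelled stream clean up
            except asyncio.CancelledError:
                pass
        self._task = asyncio.create_task(stream_reply(user_message))
```

In agent mode it looks like nothing equivalent happens: the interrupted run seems to keep executing server-side, and its output is delivered together with the next reply.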