
Conversation

@gadenbuie (Collaborator):

Also applies to `stream_async()`.

@gadenbuie gadenbuie marked this pull request as ready for review May 9, 2025 14:01
R/chat.R (outdated diff):
#' best for interactive applications, especially when a tool may involve
#' an interactive user interface. Concurrent mode is best for automated
#' scripts or non-interactive applications.
chat_async = function(..., tool_mode = c("sequential", "concurrent")) {
@gadenbuie (Collaborator, PR author):

Do you think we should default to concurrent or sequential tool calling in the async chat/stream methods?

  • concurrent: status quo, makes sense for non-interactive use cases
  • sequential: best when the chat is powering a user interface, e.g. shinychat

Ultimately it's unlikely to matter much for most tools unless they're async or correctly set up to use true parallel processing. `"sequential"` seems like a good default, but if you've gone through the trouble of setting up `future` or `mirai`, it'd be nicer for async tool calling to just work than to also need to change this argument value.

Member:

Go with your gut. I think shinychat is going to be 99% of the users of this function 😄

@gadenbuie (Collaborator, PR author):

Thanks! Okay, my gut says "concurrent". "sequential" is the weirder use-case, and the distinction only becomes relevant at the most extreme end of the async spectrum: async streaming with async tool functions that do actual async work.
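
To illustrate the decision above, here is a minimal sketch of how the `tool_mode` argument might be used with ellmer's async chat method. The model name and prompts are assumptions for illustration; the sketch is not runnable as shown since it needs a registered tool and API credentials.

```r
library(ellmer)

# Assumed setup: a chat object with at least one tool registered.
chat <- chat_openai(model = "gpt-4o")  # model choice is an assumption

# Default ("concurrent"): tool calls may run at the same time.
# Best for non-interactive scripts, especially with truly async tools
# backed by future or mirai.
p1 <- chat$chat_async("What's the weather in Oslo and in Paris?")

# "sequential": tools run one at a time, in order.
# Best when the chat powers a UI (e.g. shinychat), where a tool may
# need to interact with the user before the next tool runs.
p2 <- chat$chat_async(
  "What's the weather in Oslo and in Paris?",
  tool_mode = "sequential"
)
```

Both calls return promises; the distinction only surfaces when a turn requests multiple tool calls and those tools do real async or parallel work.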

@gadenbuie gadenbuie merged commit a7e5161 into tidyverse:main May 9, 2025
10 checks passed
@gadenbuie gadenbuie deleted the feat/async-tool-mode branch May 9, 2025 15:07
schloerke added a commit to schloerke/ellmer that referenced this pull request May 15, 2025
* main: (26 commits)
  Drop `completed` (tidyverse#495)
  Add PortkeyAI support (tidyverse#471)
  feat: Adds `stream = "content"` option to `$stream()` and `$stream_async()` (tidyverse#494)
  feat(Chat): `tool_request` and `tool_result` callbacks (tidyverse#493)
  feat: Add `tool_reject()` (tidyverse#490)
  fix: Tweak simple tools test to avoid Claude's empty strings (tidyverse#492)
  feat(chat_async): Adds `tool_mode` to choose between sequential or concurrent tool calling (tidyverse#488)
  And azure :|
  Update snapshot tests; fix bad bedrock model choice
  chore: Remove trailing whitespace from NEWS (tidyverse#489)
  Add (some) functions for listing models (tidyverse#454)
  Move `chat$extract_data_parallel()` to `parallel_chat_structured()` (tidyverse#486)
  refactor: Use generators for tool invocation (tidyverse#487)
  refactor: Remove private invoke_tools methods (tidyverse#485)
  Automatically convert tool inputs (tidyverse#463)
  Allow `$extract_data_parallel()` to return tokens + cost (tidyverse#449)
  Start work on programming vignette (tidyverse#458)
  Revise type coercion (tidyverse#484)
  Tweak `chat_openai_test()` (tidyverse#483)
  refactor(tools): `match_tools()` and `tool_results_as_turn()` (tidyverse#480)
  ...