Conversation
Couldn't run the e2e tests. Got this issue running them. If this PR looks alright, I'd appreciate it if a maintainer could run them. Cheers
Based on #1689 by @dylnslck. Adds optional chaining and deprecated property fallbacks to token detail access across both generateObject and generateText code paths, matching the safe pattern already used in the generateObject path of aisdk.ts. Renames `u` to `usage` for consistency with the rest of the codebase. Co-Authored-By: Dylan Slack <[email protected]> Co-Authored-By: Claude Opus 4.6 <[email protected]>
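The safe access pattern the commit message describes (optional chaining on the nested token details, falling back to the deprecated flat properties) might look roughly like this self-contained sketch. The `Usage` type and the fallback field names here are assumptions for illustration, not the actual AI SDK types:

```typescript
// Hypothetical usage shape: v6-style nested token details plus the
// deprecated flat fields some providers may still populate.
type Usage = {
  outputTokenDetails?: { reasoningTokens?: number };
  inputTokenDetails?: { cacheReadTokens?: number };
  reasoningTokens?: number; // deprecated fallback (assumed name)
  cachedInputTokens?: number; // deprecated fallback (assumed name)
};

// Prefer the nested v6 detail, fall back to the deprecated field, then 0.
function getReasoningTokens(usage?: Usage): number {
  return usage?.outputTokenDetails?.reasoningTokens ?? usage?.reasoningTokens ?? 0;
}

function getCacheReadTokens(usage?: Usage): number {
  return usage?.inputTokenDetails?.cacheReadTokens ?? usage?.cachedInputTokens ?? 0;
}
```

The point of the pattern is that a missing `usage` object, a missing detail object, or a missing field all degrade to `0` instead of throwing.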
## why

Updates the AI SDK and all of its providers to the latest versions. This fixes #1645.

## what changed
- **Dependencies:** Upgraded `ai` from ^5.0.133 to ^6.0.0, `@ai-sdk/provider` from ^2.0.0 to ^3.0.0, and all optional AI provider packages (anthropic, openai, google, etc.) to their v3/v4 counterparts.
- **Language model migration:** Migrated from `LanguageModelV2` to `LanguageModelV3` across AISdkClient, LLMClient, and related typings.
- **API changes:** Replaced deprecated `generateObject`/`streamObject` with `generateText`/`streamText` using `Output.object({ schema })`. Added `objectShims.ts` to preserve the `{ object }` and `partialObjectStream` shapes for callers.
- **Tool integration:** Updated all agent tools (click, type, scroll, dragAndDrop, fillFormVision, screenshot, ariaTree, wait) to use the new tool result shape: `toModelOutput` callbacks now receive `{ output }` instead of `result`.
- **Message types:** Switched from `CoreSystemMessage`, `CoreUserMessage`, and `CoreAssistantMessage` to `ModelMessage`.
- **Flow logger:** Added `specificationVersion: "v3"` to the LLM logging middleware.
- **Usage handling:** Updated usage token access for the new structure (`outputTokenDetails?.reasoningTokens`, `inputTokenDetails?.cacheReadTokens`, etc.).

## backwards compatibility
LLMClient keeps the old `generateObject` and `streamObject` API surface. `objectShims.ts` implements these by wrapping `generateText`/`streamText` with `Output.object({ schema })` and mapping the results:

- `generateObjectShim` returns `{ object, ...rest }` (instead of `{ output }`) so callers that destructure `{ object }` still work
- `streamObjectShim` exposes `partialObjectStream` as an alias for `partialOutputStream`

No changes required for code that uses `llmClient.generateObject()` or `llmClient.streamObject()`.

## test plan
- `pnpm install` and `pnpm build` succeed
- `pnpm test` passes (unit/vitest tests)
- `pnpm e2e:local` or `pnpm e2e` passes to verify agent flows with the upgraded SDK
- `LanguageModelV3` models from current provider packages (e.g. `@ai-sdk/openai` ^3.x) typecheck without TypeScript errors

Summary by cubic
Upgrade the AI SDK to v6 and all providers to their latest major versions, migrating to LanguageModelV3 and the new structured output API while keeping our public LLMClient API intact. This improves provider compatibility and fixes #1645.
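The shim layer that keeps the public LLMClient API intact can be sketched as below. This is a minimal self-contained illustration of the described mapping, not the real `objectShims.ts`: `generateTextStub` stands in for the AI SDK's `generateText` with `Output.object({ schema })`, and the result shapes are assumed for the example.

```typescript
// Stand-in for the v6 generateText result: structured output arrives
// under `output` rather than the legacy `object` key.
type TextResult<T> = { output: T; finishReason: string };

async function generateTextStub<T>(value: T): Promise<TextResult<T>> {
  return { output: value, finishReason: "stop" };
}

// generateObjectShim: legacy callers destructure { object }, so rename
// the v6 { output } field back to { object } and pass the rest through.
async function generateObjectShim<T>(value: T) {
  const { output, ...rest } = await generateTextStub(value);
  return { object: output, ...rest };
}

// streamObjectShim: expose partialObjectStream as an alias for the
// v6 partialOutputStream so legacy stream consumers keep working.
function streamObjectShim<T>(result: { partialOutputStream: AsyncIterable<T> }) {
  return { ...result, partialObjectStream: result.partialOutputStream };
}
```

Callers that do `const { object } = await llmClient.generateObject(...)` or iterate `partialObjectStream` see no difference either way.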
Written for commit ff7bbd8. Summary will update on new commits.
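As an illustration of the tool-result change listed under "what changed": under the new shape, a `toModelOutput` callback destructures `{ output }` rather than receiving the result directly. The surrounding types and the return shape in this sketch are hypothetical, not the real agent tool definitions:

```typescript
// Hypothetical v6-style tool-result context: the tool's result is
// delivered as { output } instead of a bare `result` argument.
type ToolResultContext<T> = { output: T };

// New-style callback: destructure { output } where the old code took `result`.
function toModelOutput({ output }: ToolResultContext<string>) {
  return { type: "text" as const, value: output };
}
```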