fix: not process attached key to LLM #4545
Conversation
LGTM
Walkthrough
This change mainly introduces new enum and regular-expression constants for file and folder placeholders, and replaces hardcoded strings in the chat-related logic with these symbolic keys. It also reorders the menu items in the mention input component and adds a font-size style to mention labels. In addition, the dialog service's hide method now handles null values safely.
Changes
Sequence Diagram(s)
sequenceDiagram
    participant User
    participant ChatView
    participant LLMContext
    User->>ChatView: Send message (containing file/folder placeholders)
    ChatView->>LLMContext: Replace placeholders with LLM_CONTEXT_KEY
    ChatView->>ChatView: Remove content matching LLM_CONTEXT_KEY_REGEX from history messages
    ChatView->>User: Display the processed message
Possibly related PRs
Suggested labels
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- packages/ai-native/src/browser/chat/chat.view.tsx (5 hunks)
- packages/ai-native/src/browser/components/ChatMentionInput.tsx (1 hunk)
- packages/ai-native/src/browser/components/mention-input/mention-input.module.less (1 hunk)
- packages/ai-native/src/common/llm-context.ts (1 hunk)
- packages/overlay/src/browser/dialog.service.ts (1 hunk)
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/ai-native/src/browser/chat/chat.view.tsx (1)
packages/ai-native/src/common/llm-context.ts (1): LLM_CONTEXT_KEY_REGEX (74-74)
⏰ Context from checks skipped due to timeout of 90000ms (8)
- GitHub Check: build (ubuntu-latest, 20.x)
- GitHub Check: build-windows
- GitHub Check: build (macos-latest, 20.x)
- GitHub Check: unittest (ubuntu-latest, 18.x, node)
- GitHub Check: unittest (ubuntu-latest, 18.x, jsdom)
- GitHub Check: unittest (macos-latest, 18.x, jsdom)
- GitHub Check: unittest (macos-latest, 18.x, node)
- GitHub Check: ubuntu-latest, Node.js 20.x
🔇 Additional comments (9)
packages/overlay/src/browser/dialog.service.ts (1)
67-67: Stronger defensive programming. Using the optional chaining operator (?.) prevents a runtime error when this.deferred is null or undefined, making the code more robust.
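For reference, a minimal sketch of the null-safe pattern described above; the class shape and method signature below are illustrative assumptions, not the actual DialogService implementation:

```ts
// Sketch only: a simplified stand-in for the real DialogService class.
class DialogServiceSketch {
  // May be undefined when no dialog is currently open.
  private deferred?: { resolve: (value?: string) => void };

  hide(value?: string): void {
    // Optional chaining avoids a runtime error when hide() is called
    // while no dialog (and therefore no deferred) exists.
    this.deferred?.resolve(value);
  }
}
```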
packages/ai-native/src/common/llm-context.ts (2)
69-72: Well-structured constant definitions. Extracting the attached-file and attached-folder markers into enum constants centralizes the management of these values and improves maintainability.
74-74: Effectively centralizes the regex definition. A single regular-expression constant that matches the attached file and folder markers lets the code handle these markers uniformly and avoids duplicated hardcoded patterns.
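As a rough sketch of what these constants could look like: the enum members and the regex name are taken from this PR, but the marker string values below are illustrative assumptions only.

```ts
// Sketch of the llm-context.ts additions; actual marker values may differ.
export enum LLM_CONTEXT_KEY {
  AttachedFile = '<attached-file>',
  AttachedFolder = '<attached-folder>',
}

// Global regex so callers can strip every marker occurrence in one
// replaceAll() pass (replaceAll requires the `g` flag on a regex).
export const LLM_CONTEXT_KEY_REGEX = new RegExp(
  `${LLM_CONTEXT_KEY.AttachedFile}|${LLM_CONTEXT_KEY.AttachedFolder}`,
  'g',
);
```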
packages/ai-native/src/browser/components/mention-input/mention-input.module.less (1)
330-330: Improves UI consistency. Adding an explicit font-size for mention labels ensures they render consistently.
packages/ai-native/src/browser/chat/chat.view.tsx (4)
44-48: Sensible import extension. The imports are extended to include the newly defined enum and regex constants in preparation for the changes below.
709-710: Enum instead of hardcoded string. Using the LLM_CONTEXT_KEY.AttachedFile enum value in place of a hardcoded string improves maintainability and consistency.
720-721: Enum instead of hardcoded string. Using the LLM_CONTEXT_KEY.AttachedFolder enum value in place of a hardcoded string keeps folder handling consistent with file handling.
737-739: Consistent use of constants for code references. The enum constants are also used when handling code references, so markers are handled the same way across the codebase and potential mismatches are avoided.
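A hedged sketch of the pattern these comments describe; the helper name, its signature, and the import path are made up for illustration, and only the LLM_CONTEXT_KEY enum members come from the PR:

```ts
import { LLM_CONTEXT_KEY } from '../../common/llm-context';

// Hypothetical helper: tag attached resources with the shared enum markers
// instead of repeating hardcoded marker strings at every call site.
function appendAttachedContext(message: string, fileUris: string[], folderUris: string[]): string {
  const fileLines = fileUris.map((uri) => `${LLM_CONTEXT_KEY.AttachedFile}${uri}`);
  const folderLines = folderUris.map((uri) => `${LLM_CONTEXT_KEY.AttachedFolder}${uri}`);
  return [message, ...fileLines, ...folderLines].join('\n');
}
```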
packages/ai-native/src/browser/components/ChatMentionInput.tsx (1)
267-314: The reordering matches the refined mention-item display logic. Moving the "Code" mention item from the start of the array to the end is in line with the PR's goal of keeping attached keys out of LLM processing, and it coordinates with the standardized use of the LLM_CONTEXT_KEY enum for referencing context in other files. Reordering the menu items does not change behavior; it only changes the display order in the UI and makes the interface more consistent.
  msgHistoryManager.addUserMessage({
-   content: message,
+   content: message.replaceAll(LLM_CONTEXT_KEY_REGEX, ''),
Fixes the key bug: keep attached keys from being sent to the LLM.
The regular expression strips the attached file and folder markers from the message, ensuring these markers are never sent to the LLM for processing, which matches the PR's main goal.
This change resolves the original issue: the LLM no longer processes the attached marker text.
🤖 Prompt for AI Agents (early access)
In packages/ai-native/src/browser/chat/chat.view.tsx at line 645, the content
field should be set by removing all occurrences of additional key markers from
the message using the LLM_CONTEXT_KEY_REGEX regular expression. This prevents
sending appended file and folder tags to the LLM, ensuring only the cleaned
message content is processed. Update the code to replace all matches of
LLM_CONTEXT_KEY_REGEX in the message with an empty string before assigning it to
content.
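For illustration, a minimal sketch of the cleanup step the diff above introduces; the wrapper function and import path are made-up assumptions, and only LLM_CONTEXT_KEY_REGEX and the replaceAll call come from the PR:

```ts
import { LLM_CONTEXT_KEY_REGEX } from '../../common/llm-context';

// Hypothetical wrapper: strip attached-file/folder markers before the message
// is stored in chat history, so the markers never reach the LLM.
function toHistoryContent(message: string): string {
  // String.prototype.replaceAll throws if given a non-global regex, so
  // LLM_CONTEXT_KEY_REGEX is assumed to carry the `g` flag.
  return message.replaceAll(LLM_CONTEXT_KEY_REGEX, '');
}
```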
Codecov Report
Attention: Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## main #4545 +/- ##
=======================================
Coverage 52.88% 52.88%
=======================================
Files 1677 1677
Lines 103354 103358 +4
Branches 22381 22391 +10
=======================================
+ Hits 54657 54662 +5
+ Misses 40519 40516 -3
- Partials 8178 8180 +2
Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
Types
Background or solution
Remove the redundant text structure when referencing files and other attached content.
Changelog
not process attached key to LLM
Summary by CodeRabbit
New Features
Styles
Optimizations