
Conversation

@mukundkumarjha
Contributor

Fixes #1774

This PR adds support for context manager usage of LLMConfig in components outside of agents. Specifically:

  • Added support in BrowserUseTool
  • Added support in AgentOptimizer

Changes:

  • Modified BrowserUseTool.__init__ to accept llm_config as None and fetch it from the context if not provided
  • Modified AgentOptimizer.__init__ to accept llm_config as None and fetch it from the context if not provided
  • Updated documentation to reflect the new usage

This allows these components to be used with the with llm_config: syntax, similar to how it's used in agents.

Example usage:

with llm_config:
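
A minimal sketch of the intended usage, assuming AG2-style import paths and illustrative constructor arguments (the LLMConfig fields and the AgentOptimizer max_actions_per_step parameter are placeholders and may not match the actual signatures):

```python
from autogen import LLMConfig
from autogen.agentchat.contrib.agent_optimizer import AgentOptimizer
from autogen.tools.experimental import BrowserUseTool

# Illustrative config values only.
llm_config = LLMConfig(api_type="openai", model="gpt-4o-mini")

with llm_config:
    # No llm_config passed explicitly: both components pick up the config
    # that the context manager makes current, as described above.
    browser_tool = BrowserUseTool()
    optimizer = AgentOptimizer(max_actions_per_step=3)  # parameter name assumed
```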

@CLAassistant

CLAassistant commented May 5, 2025

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
4 out of 5 committers have signed the CLA.

✅ mukundkumarjha
✅ qingyun-wu
✅ marklysze
✅ kumaranvpl
❌ mj2472
You have signed the CLA already but the status is still pending? Let us recheck it.

@harishmohanraj
Collaborator

@mukundkumarjha: Could you please sign the CLA?

@mukundkumarjha
Contributor Author

@harishmohanraj Done

@harishmohanraj
Collaborator

Could you please double-check the CLA? The badge still shows as "not signed yet."

In the meantime, I had a quick look at your changes. Could you please add or update the relevant tests? You can refer to the following test files:

  • test/agentchat/contrib/test_agent_optimizer.py
  • test/tools/experimental/browser_use/test_browser_use.py

@mukundkumarjha
Contributor Author

Could you please double-check the CLA? The badge still shows as "not signed yet."

In the meantime, I had a quick look at your changes. Could you please add or update the relevant tests? You can refer to the following test files:

  • test/agentchat/contrib/test_agent_optimizer.py
  • test/tools/experimental/browser_use/test_browser_use.py

You have signed the CLA already but the status is still pending? Let us recheck it.

This is the message I can see; when I try to recheck, I am being redirected to the same page. :(

@mukundkumarjha mukundkumarjha requested a review from kumaranvpl May 6, 2025 05:02
@qingyun-wu
Collaborator

@mukundkumarjha could you check again on CLA? Thanks for the contribution! Nice work!

@mukundkumarjha
Contributor Author

@qingyun-wu Thanks for your kind words.

[Screenshot attached: Screenshot 2025-06-05 at 9 27 47 AM]

I have already signed the CLA; I have attached a screenshot for reference.

@qingyun-wu
Collaborator

Thanks @mukundkumarjha! One thing you need to do:

Several PR checks are failing. Please check the error messages and fix them; it shouldn't be too difficult. You can check our contributing guide for more detailed instructions: https://docs.ag2.ai/latest/docs/contributor-guide/pre-commit/

@marklysze Please help if @mukundkumarjha has difficulty in resolving the issues.

Thanks both!

@qingyun-wu
Collaborator

@mukundkumarjha, can I have your email? Thanks!

@mukundkumarjha
Contributor Author

mukundkumarjha commented Jun 5, 2025

@mukundkumarjha, can I have your email? Thanks!

sure, [email protected]

@marklysze
Collaborator

@mukundkumarjha would you be able to give me write access to your fork, and I'll help with the fixes? @marklysze is my username.

@mukundkumarjha
Contributor Author

Done.

@marklysze
Collaborator

@mj2472 can you please sign the CLA?

@marklysze
Collaborator

Done.

Thanks @mukundkumarjha, I've made an update to use LLMConfig.current instead of LLMConfig.current() as current is a property and not a method. I've added some tests as well.

Can you test it yourself and make sure it's working as you expect?
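
For reference, a minimal sketch of the fallback pattern described in this thread, using a hypothetical SomeComponent in place of BrowserUseTool or AgentOptimizer; note that LLMConfig.current is accessed as a property, not called:

```python
from typing import Optional

from autogen import LLMConfig


class SomeComponent:  # hypothetical stand-in for BrowserUseTool / AgentOptimizer
    def __init__(self, llm_config: Optional[LLMConfig] = None):
        # Fall back to the config made current by an enclosing `with llm_config:`
        # block; `current` is a property, so no call parentheses.
        self.llm_config = llm_config if llm_config is not None else LLMConfig.current
```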

@qingyun-wu qingyun-wu merged commit becaa4e into ag2ai:main Jun 20, 2025
11 of 12 checks passed
@codecov

codecov bot commented Jun 20, 2025

Codecov Report

Attention: Patch coverage is 0% with 2 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| ...ogen/tools/experimental/browser_use/browser_use.py | 0.00% | 2 Missing ⚠️ |

❗ There is a different number of reports uploaded between BASE (ca47abd) and HEAD (bbfa615).

HEAD has 1241 uploads less than BASE
| Flag | BASE (ca47abd) | HEAD (bbfa615) |
|---|---|---|
| 3.9 | 84 | 0 |
| ubuntu-latest | 122 | 1 |
| commsagent-discord | 9 | 0 |
| optional-deps | 171 | 0 |
| commsagent-slack | 9 | 0 |
| 3.10 | 103 | 0 |
| 3.13 | 91 | 0 |
| commsagent-telegram | 9 | 0 |
| browser-use | 3 | 0 |
| core-without-llm | 14 | 1 |
| 3.11 | 42 | 1 |
| macos-latest | 112 | 0 |
| 3.12 | 32 | 0 |
| windows-latest | 118 | 0 |
| twilio | 9 | 0 |
| interop | 9 | 0 |
| retrievechat-mongodb | 10 | 0 |
| crawl4ai | 9 | 0 |
| jupyter-executor | 9 | 0 |
| retrievechat-pgvector | 10 | 0 |
| graph-rag-falkor-db | 6 | 0 |
| rag | 7 | 0 |
| retrievechat-qdrant | 14 | 0 |
| retrievechat | 15 | 0 |
| gpt-assistant-agent | 3 | 0 |
| agent-eval | 1 | 0 |
| lmm | 3 | 0 |
| teachable | 3 | 0 |
| gemini | 14 | 0 |
| ollama | 14 | 0 |
| retrievechat-couchbase | 3 | 0 |
| interop-crewai | 8 | 0 |
| websockets | 8 | 0 |
| google-api | 9 | 0 |
| mcp | 9 | 0 |
| wikipedia-api | 7 | 0 |
| interop-pydantic-ai | 9 | 0 |
| cerebras | 13 | 0 |
| docs | 6 | 0 |
| interop-langchain | 9 | 0 |
| llama-index-agent | 3 | 0 |
| long-context | 3 | 0 |
| together | 14 | 0 |
| groq | 11 | 0 |
| bedrock | 10 | 0 |
| cohere | 15 | 0 |
| swarm | 11 | 0 |
| anthropic | 15 | 0 |
| websurfer | 12 | 0 |
| mistral | 14 | 0 |
| Files with missing lines | Coverage Δ |
|---|---|
| ...ogen/tools/experimental/browser_use/browser_use.py | 51.72% <0.00%> (-16.14%) ⬇️ |

... and 57 files with indirect coverage changes




Development

Successfully merging this pull request may close these issues.

[Issue]: Support with syntax for all LLMConfig usages
