feat: Add ModelsLab LLM provider #4243
Conversation
- Adds ChatModelsLab class implementing the BaseChatModel protocol
- Uses ModelsLab's OpenAI-compatible chat API (base_url: https://modelslab.com/api/v6/llm)
- Supports structured output via response_format
- Follows the OpenRouter provider pattern with minimal changes
- Includes tests (5 passing) and example usage
- Set the MODELSLAB_API_KEY env var to use

API docs: https://docs.modelslab.com
1 issue found across 5 files
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="browser_use/llm/__init__.py">
<violation number="1" location="browser_use/llm/__init__.py:39">
P2: ChatModelsLab is missing from `_LAZY_IMPORTS` and `__all__`, preventing proper import access. The new ModelsLab provider cannot be imported via `from browser_use.llm import ChatModelsLab` or star imports because it was only added to TYPE_CHECKING imports but not registered in the lazy import system.</violation>
</file>
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
from browser_use.llm.google.chat import ChatGoogle
from browser_use.llm.groq.chat import ChatGroq
from browser_use.llm.mistral.chat import ChatMistral
from browser_use.llm.modelslab.chat import ChatModelsLab
P2: ChatModelsLab is missing from _LAZY_IMPORTS and __all__, preventing proper import access. The new ModelsLab provider cannot be imported via from browser_use.llm import ChatModelsLab or star imports because it was only added to TYPE_CHECKING imports but not registered in the lazy import system.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At browser_use/llm/__init__.py, line 39:
<comment>ChatModelsLab is missing from `_LAZY_IMPORTS` and `__all__`, preventing proper import access. The new ModelsLab provider cannot be imported via `from browser_use.llm import ChatModelsLab` or star imports because it was only added to TYPE_CHECKING imports but not registered in the lazy import system.</comment>
<file context>
@@ -36,6 +36,7 @@
from browser_use.llm.google.chat import ChatGoogle
from browser_use.llm.groq.chat import ChatGroq
from browser_use.llm.mistral.chat import ChatMistral
+ from browser_use.llm.modelslab.chat import ChatModelsLab
from browser_use.llm.oci_raw.chat import ChatOCIRaw
from browser_use.llm.ollama.chat import ChatOllama
</file context>
I have read the CLA Document and I hereby sign the CLA
Summary
Adds ModelsLab as an LLM provider for browser-use, following the OpenRouter provider pattern.
What this PR adds
- ChatModelsLab class implementing the BaseChatModel protocol
- OpenAI-compatible chat API at https://modelslab.com/api/v6/llm

Why ModelsLab
ModelsLab provides open-source LLMs (Llama, Mistral, Mixtral, etc.) at competitive pricing, including uncensored models for use cases requiring less restrictive content policies.
API docs
https://docs.modelslab.com
Usage
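Since the PR describes ChatModelsLab as a thin wrapper over ModelsLab's OpenAI-compatible endpoint, a raw request illustrates what the provider sends under the hood. The `/chat/completions` path is assumed from OpenAI compatibility, and the model name is illustrative; only the base URL and the MODELSLAB_API_KEY variable come from the PR description:

```python
import json
import os
import urllib.request

BASE_URL = 'https://modelslab.com/api/v6/llm'  # base_url from the PR description


def build_chat_request(model: str, messages: list[dict]) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request against ModelsLab.

    Assumes the standard /chat/completions path and Bearer auth, as is
    typical for OpenAI-compatible APIs; not verified against ModelsLab docs.
    """
    payload = {'model': model, 'messages': messages}
    return urllib.request.Request(
        f'{BASE_URL}/chat/completions',
        data=json.dumps(payload).encode(),
        headers={
            'Content-Type': 'application/json',
            'Authorization': f"Bearer {os.environ.get('MODELSLAB_API_KEY', '')}",
        },
    )


req = build_chat_request('llama-3.1-8b', [{'role': 'user', 'content': 'Hello'}])
```

In browser-use itself, the provider would be constructed and handed to an Agent the same way as the existing OpenRouter provider it mirrors.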
Testing
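The PR notes five passing tests; one behavior worth exercising is the structured-output support via response_format. With OpenAI-compatible APIs this is typically expressed as a `json_schema` response_format block. The helper name and schema below are illustrative, not taken from the PR's test suite:

```python
# Stdlib-only sketch of attaching an OpenAI-style structured-output
# constraint to a chat payload; names here are hypothetical.
def with_structured_output(payload: dict, schema: dict, name: str = 'output') -> dict:
    """Return a copy of the payload with a json_schema response_format attached."""
    return {
        **payload,
        'response_format': {
            'type': 'json_schema',
            'json_schema': {'name': name, 'schema': schema, 'strict': True},
        },
    }


payload = with_structured_output(
    {'model': 'llama-3.1-8b', 'messages': []},
    {'type': 'object', 'properties': {'answer': {'type': 'string'}}},
)
```

A provider test can then assert the response_format block survives serialization and that the API's reply parses against the schema.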
Checklist
Summary by cubic
Adds ModelsLab as an OpenAI-compatible LLM provider for browser-use to expand model options and support structured output, tools, and streaming. Follows the existing provider pattern with tests and an example.
New Features
Migration
Written for commit 493f02d. Summary will update on new commits.