feat: Add ModelsLab LLM provider #4243

Open

adhikjoshi wants to merge 1 commit into browser-use:main from adhikjoshi:feat/modelslab-llm-provider

Conversation


adhikjoshi commented Mar 1, 2026

Summary

Adds ModelsLab as an LLM provider for browser-use, following the OpenRouter provider pattern.

What this PR adds

  • ChatModelsLab class implementing the BaseChatModel protocol
  • Uses ModelsLab's OpenAI-compatible chat API: https://modelslab.com/api/v6/llm
  • Supports all browser-use features: streaming, structured output, tool calling
  • Models available: llama-3-8b, llama-3-70b, mistral-7b, mixtral-8x7b, and more
  • 5 unit tests (all passing)

Why ModelsLab

ModelsLab provides open-source LLMs (Llama, Mistral, Mixtral, etc.) at competitive pricing, including uncensored models for use cases requiring less restrictive content policies.

API docs

https://docs.modelslab.com

Usage

from browser_use.llm.modelslab.chat import ChatModelsLab
import os

llm = ChatModelsLab(
    model='llama-3-70b-chat',
    api_key=os.getenv('MODELSLAB_API_KEY'),
)

Testing

python -m pytest browser_use/llm/tests/test_modelslab_chat.py -v


Summary by cubic

Adds ModelsLab as an OpenAI-compatible LLM provider for browser-use to expand model options and support structured output, tools, and streaming. Follows the existing provider pattern with tests and an example.

  • New Features

    • New ChatModelsLab provider using https://modelslab.com/api/v6/llm
    • Supports streaming, tool calling, and JSON-schema structured output
    • Follows OpenRouter provider pattern; no new dependencies
    • Models include llama-3-8b/70b, mistral-7b, mixtral-8x7b, and more
    • Added mocked tests and an example script
  • Migration

    • Set MODELSLAB_API_KEY in the environment (optional: override base_url)

Written for commit 493f02d.

- Adds ChatModelsLab class implementing BaseChatModel protocol
- Uses ModelsLab's OpenAI-compatible chat API (base_url: https://modelslab.com/api/v6/llm)
- Supports structured output via response_format
- Follows OpenRouter provider pattern with minimal changes
- Includes tests (5 passing) and example usage
- Set MODELSLAB_API_KEY env var to use
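Since the PR states the endpoint is OpenAI-compatible and supports structured output via response_format, a request body would follow the standard OpenAI chat-completions shape. The sketch below is illustrative only: the model name and JSON schema are made up for the example, and the exact fields ModelsLab accepts should be confirmed against https://docs.modelslab.com.

```python
# Hedged sketch of an OpenAI-compatible structured-output payload, as the
# PR's provider would send to https://modelslab.com/api/v6/llm.
# Model name and schema are illustrative, not taken from the PR.
import json

payload = {
	'model': 'llama-3-70b-chat',
	'messages': [{'role': 'user', 'content': 'Extract the page title.'}],
	'response_format': {
		'type': 'json_schema',
		'json_schema': {
			'name': 'PageInfo',
			'schema': {
				'type': 'object',
				'properties': {'title': {'type': 'string'}},
				'required': ['title'],
			},
		},
	},
}

# Serialized request body that an OpenAI-compatible client would POST.
body = json.dumps(payload)
```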

API docs: https://docs.modelslab.com
@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.

cubic-dev-ai bot (Contributor) left a comment

1 issue found across 5 files



<file name="browser_use/llm/__init__.py">

<violation number="1" location="browser_use/llm/__init__.py:39">
P2: ChatModelsLab is missing from `_LAZY_IMPORTS` and `__all__`, preventing proper import access. The new ModelsLab provider cannot be imported via `from browser_use.llm import ChatModelsLab` or star imports because it was only added to TYPE_CHECKING imports but not registered in the lazy import system.</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.

from browser_use.llm.google.chat import ChatGoogle
from browser_use.llm.groq.chat import ChatGroq
from browser_use.llm.mistral.chat import ChatMistral
from browser_use.llm.modelslab.chat import ChatModelsLab
cubic-dev-ai bot (Contributor) commented Mar 1, 2026

P2: ChatModelsLab is missing from _LAZY_IMPORTS and __all__, preventing proper import access. The new ModelsLab provider cannot be imported via from browser_use.llm import ChatModelsLab or star imports because it was only added to TYPE_CHECKING imports but not registered in the lazy import system.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At browser_use/llm/__init__.py, line 39:

<comment>ChatModelsLab is missing from `_LAZY_IMPORTS` and `__all__`, preventing proper import access. The new ModelsLab provider cannot be imported via `from browser_use.llm import ChatModelsLab` or star imports because it was only added to TYPE_CHECKING imports but not registered in the lazy import system.</comment>

<file context>
@@ -36,6 +36,7 @@
 	from browser_use.llm.google.chat import ChatGoogle
 	from browser_use.llm.groq.chat import ChatGroq
 	from browser_use.llm.mistral.chat import ChatMistral
+	from browser_use.llm.modelslab.chat import ChatModelsLab
 	from browser_use.llm.oci_raw.chat import ChatOCIRaw
 	from browser_use.llm.ollama.chat import ChatOllama
</file context>

adhikjoshi (Author) commented

I have read the CLA Document and I hereby sign the CLA
