
added aimlapi as provider #1752


Open

OctavianTheI wants to merge 2 commits into main.

Conversation


@OctavianTheI commented Mar 27, 2025

Sup!
I'm from the integration team at AI/ML API.
We'd like to join your amazing app as one of the providers.
We offer 200+ models, including ones from OpenAI, Qwen, Llama, and more, with no region lock, which is a nice benefit for some folks.
Anyway, please check my PR and let me know if anything needs to be adjusted :)


Important

Added AI/ML API as a new provider, updating documentation and components to support its integration.

  • Provider Addition:
    • Added AI/ML API as a new provider in ModelPicker.tsx with model Llama 3.3 70B Instruct Turbo.
    • Updated README.md and packages/react/README.md to include AI/ML API in the list of supported providers.
  • Documentation Updates:
    • Updated getting-started.mdx to include installation and setup instructions for AI/ML API.
    • Modified rsc.mdx, use-chat-hook.mdx, and use-chat.mdx to list AI/ML API as a supported integration.
  • Environment Setup:
    • Added AI/ML API environment variable setup in getting-started.mdx.

This description was created by Ellipsis for 387e35e. It will automatically update as commits are pushed.


vercel bot commented Mar 27, 2025

@OctavianTheI is attempting to deploy a commit to the assistant-ui Team on Vercel.

A member of the Team first needs to authorize it.

Contributor

LGTM 👍

Contributor

@greptile-apps bot left a comment


PR Summary

This PR integrates AI/ML API as a new provider across documentation and components, ensuring consistency with the existing naming conventions and assets.

• Updated apps/docs/content/docs/runtimes/ai-sdk/rsc.mdx to include AI/ML API among supported providers.
• Modified README.md to list AI/ML API in the supported model providers section.
• Added a new model entry and corresponding AIML API SVG asset in apps/docs/components/shadcn/ModelPicker.tsx.
• Updated use-chat.mdx and use-chat-hook.mdx to reflect AI/ML API integration.
  • Enhanced apps/docs/content/docs/getting-started.mdx with AI/ML API setup instructions, noting a potential syntax issue with the unquoted "AI/ML API" entry in the environment-variables Tabs list.

💡 (1/5) You can manually trigger the bot by mentioning @greptileai in a comment!

7 file(s) reviewed, 1 comment(s)

</Tabs>

Define environment variables:

<Tabs id="provider" items={["OpenAI", "Anthropic", "Azure", "AWS", "Gemini", "GCP", "Groq", "Fireworks", "Cohere", "Ollama", "Chrome AI"]}>
<Tabs id="provider" items={["OpenAI", "Anthropic", "Azure", "AWS", "Gemini", "GCP", "Groq", "Fireworks", "Cohere", "Ollama", "Chrome AI", AI/ML API]}>
Contributor


syntax: Wrap 'AI/ML API' in quotes in the Tabs items for environment variables.

Suggested change
<Tabs id="provider" items={["OpenAI", "Anthropic", "Azure", "AWS", "Gemini", "GCP", "Groq", "Fireworks", "Cohere", "Ollama", "Chrome AI", AI/ML API]}>
<Tabs id="provider" items={["OpenAI", "Anthropic", "Azure", "AWS", "Gemini", "GCP", "Groq", "Fireworks", "Cohere", "Ollama", "Chrome AI", "AI/ML API"]}>

Contributor

promptless bot commented Mar 27, 2025

📝 Documentation updates detected! You can review documentation updates here

</Tabs>

Define environment variables:

<Tabs id="provider" items={["OpenAI", "Anthropic", "Azure", "AWS", "Gemini", "GCP", "Groq", "Fireworks", "Cohere", "Ollama", "Chrome AI"]}>
<Tabs id="provider" items={["OpenAI", "Anthropic", "Azure", "AWS", "Gemini", "GCP", "Groq", "Fireworks", "Cohere", "Ollama", "Chrome AI", AI/ML API]}>
Contributor


Wrap 'AI/ML API' in quotes in the Tabs items array to maintain valid syntax.

Suggested change
<Tabs id="provider" items={["OpenAI", "Anthropic", "Azure", "AWS", "Gemini", "GCP", "Groq", "Fireworks", "Cohere", "Ollama", "Chrome AI", AI/ML API]}>
<Tabs id="provider" items={["OpenAI", "Anthropic", "Azure", "AWS", "Gemini", "GCP", "Groq", "Fireworks", "Cohere", "Ollama", "Chrome AI", "AI/ML API"]}>

Contributor

coderabbitai bot commented Mar 27, 2025

Walkthrough

The update incorporates the "AI/ML API" integration into the assistant-ui library. Across documentation and code components, references to supported model providers have been revised to include this new API alongside existing services. In the ModelPicker component, a new model entry, "Llama 3.3 70B Instruct Turbo," is added with its corresponding icon. The changes extend to the getting-started documentation, where terminal commands, API endpoint implementations, and environment variable setups now account for the "AI/ML API" provider. Additionally, various sections in the runtime and hook documentation have been updated to reflect the expanded list of integrations. No modifications were made to the declarations of exported or public entities.


📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Lite

📥 Commits

Reviewing files that changed from the base of the PR and between 387e35e and fbe168a.

📒 Files selected for processing (1)
  • apps/docs/content/docs/getting-started.mdx (4 hunks)
🔇 Additional comments (5)
apps/docs/content/docs/getting-started.mdx (5)

73-73: Provider Tabs Definition: Provider Name Quoting

The Tabs component now correctly lists "AI/ML API" (properly wrapped in quotes) alongside the other providers. This addresses previous concerns about unquoted strings in the provider list.


119-121: Installation Command Consistency Check

The terminal installation command for the AI/ML API provider appears to use the same package as the OpenAI integration (@ai-sdk/openai). Please confirm that this is intentional and that the AI/ML API service indeed uses the same SDK module. If a different package should be used, update this command accordingly.


315-334: New API Endpoint Integration for AI/ML API

This new code block correctly sets up an API endpoint for the AI/ML API provider. The configuration (using createOpenAI with a custom baseURL and the new environment variable AIMLAPI_API_KEY) looks well integrated.

  • Please verify that the baseURL (https://api.aimlapi.com/v1) and the model identifier (accounts/aimlapi/models/meta-llama/Llama-3.3-70B-Instruct-Turbo) are correct.
  • Also, confirm that using the OpenAI SDK (@ai-sdk/openai) for this provider is the expected design.
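For readers who want a concrete picture, here is a minimal sketch of such an endpoint. The Next.js route path, the streamText helper, and the response call are assumptions and not taken from the PR; the baseURL, the AIMLAPI_API_KEY variable, and the model identifier are the ones quoted above and remain subject to the verification requested.

```ts
// app/api/chat/route.ts (hypothetical path, shown only as an illustration)
import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

// AI/ML API is treated as OpenAI-compatible here, so the OpenAI adapter is
// reused with a custom baseURL and its own API key, as described above.
const aimlapi = createOpenAI({
  baseURL: "https://api.aimlapi.com/v1",
  apiKey: process.env.AIMLAPI_API_KEY,
});

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    // Model identifier as quoted in this review; verify it against the AI/ML API docs.
    model: aimlapi("accounts/aimlapi/models/meta-llama/Llama-3.3-70B-Instruct-Turbo"),
    messages,
  });
  return result.toDataStreamResponse();
}
```

Reusing @ai-sdk/openai in this way would also explain why the installation command matches the OpenAI tab, as noted earlier in this review.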

340-342: Environment Variables Tab - Provider List Consistency

The Tabs component used to list environment variables now includes "AI/ML API", which is consistent with the installation and API endpoint sections. This ensures that the new provider is clearly visible throughout the documentation.


389-391: New Environment Variable for AI/ML API

The new code block defining the AIMLAPI_API_KEY for the AI/ML API provider is clear and correctly formatted. Ensure that documentation or setup instructions elsewhere in the project remind users to secure this key properly.
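Purely as an illustration, and not part of the PR, a fail-fast guard of this kind is one way to make a missing key obvious during local setup:

```ts
// Hypothetical startup guard; the PR itself only documents the .env.local entry.
if (!process.env.AIMLAPI_API_KEY) {
  throw new Error(
    "AIMLAPI_API_KEY is not set. Add it to .env.local before starting the app.",
  );
}
```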



Contributor

@coderabbitai bot left a comment


Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between dc8e1f2 and 387e35e.

⛔ Files ignored due to path filters (1)
  • apps/docs/assets/providers/aimlapi.svg is excluded by !**/*.svg
📒 Files selected for processing (7)
  • README.md (1 hunks)
  • apps/docs/components/shadcn/ModelPicker.tsx (2 hunks)
  • apps/docs/content/docs/getting-started.mdx (4 hunks)
  • apps/docs/content/docs/runtimes/ai-sdk/rsc.mdx (1 hunks)
  • apps/docs/content/docs/runtimes/ai-sdk/use-chat-hook.mdx (1 hunks)
  • apps/docs/content/docs/runtimes/ai-sdk/use-chat.mdx (1 hunks)
  • packages/react/README.md (1 hunks)
🔇 Additional comments (10)
apps/docs/content/docs/runtimes/ai-sdk/use-chat.mdx (1)

8-8: Documentation updated with new provider support.

The Vercel AI SDK UI's useChat hook documentation now includes "AI/ML API" in the list of supported integrations.

apps/docs/content/docs/runtimes/ai-sdk/rsc.mdx (1)

8-8: Documentation updated with new provider support.

The Vercel AI SDK RSC integration documentation now includes "AI/ML API" in the supported integration list, maintaining consistency with other documentation changes.

packages/react/README.md (1)

26-26: Documentation updated with new provider support.

The packages/react README now includes "AI/ML API" in the list of supported model providers, keeping documentation consistent with the main README.

README.md (1)

26-26: Documentation updated with new provider support.

The main README now includes "AI/ML API" in the supported model providers list, reflecting the new integration.

apps/docs/components/shadcn/ModelPicker.tsx (2)

7-7: New provider icon import added correctly.

The import for the AI/ML API provider icon has been added appropriately.


52-56:

✅ Verification successful

New model added to the ModelPicker component.

The Llama 3.3 70B Instruct Turbo model has been successfully added to the ModelPicker with the correct provider icon.


🏁 Script executed:

#!/bin/bash
# Verify that the aimlapi.svg file exists
fd -t f "aimlapi.svg" assets/providers

Length of output: 142


Please verify the location of the aimlapi provider icon.
The new "Llama 3.3 70B Instruct Turbo" model has been added to the ModelPicker as intended. However, our initial verification using the path assets/providers resulted in an error, suggesting that the directory might be incorrect or that the file exists elsewhere. Kindly ensure that the file aimlapi.svg is present in the correct location (or use a repository-wide search, such as the script above, to confirm its presence).


🏁 Script executed:

#!/bin/bash
# Search for the aimlapi.svg file across the entire repository without restricting to a specific path.
fd -t f "aimlapi.svg"

Length of output: 182


New model integration verified in ModelPicker

  • The "Llama 3.3 70B Instruct Turbo" entry is correctly added in the ModelPicker component.
  • Verification confirmed that the aimlapi.svg file is present at apps/docs/assets/providers/aimlapi.svg, ensuring that the correct provider icon is in use.
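As a rough, hypothetical sketch of what such an entry could look like in ModelPicker.tsx (the component's actual field names and model value are not shown in this thread and may differ):

```tsx
// ModelPicker.tsx (excerpt, hypothetical shape)
import aimlapi from "../../assets/providers/aimlapi.svg";

const models = [
  // ...existing entries for the other providers
  {
    name: "Llama 3.3 70B Instruct Turbo", // display name confirmed by the review
    icon: aimlapi, // icon verified at apps/docs/assets/providers/aimlapi.svg
  },
];
```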
apps/docs/content/docs/runtimes/ai-sdk/use-chat-hook.mdx (1)

8-8: Inclusion of AI/ML API in the integrations list
The updated sentence now includes "AI/ML API" alongside the other supported providers. This update aligns with the PR objectives and is clearly communicated.

apps/docs/content/docs/getting-started.mdx (3)

73-73: Updated Provider Tabs: Added "AI/ML API"
The Tabs component’s items array now includes "AI/ML API" along with the other providers. This change ensures that users see the new integration option in the installation instructions.


119-121: "AI/ML API" Terminal Installation Command
The Terminal command block for the "AI/ML API" tab installs the same package (@ai-sdk/openai) as used for OpenAI. This appears intentional—using the underlying OpenAI adapter with a different configuration. Just verify that this reuse is on purpose for the AIML API integration.


315-334: API Endpoint Implementation for "AI/ML API"
The new API endpoint code block correctly configures the AIML API integration using createOpenAI with a custom baseURL and the environment variable AIMLAPI_API_KEY. The model identifier "accounts/aimlapi/models/meta-llama/Llama-3.3-70B-Instruct-Turbo" is clearly specified.
Consider adding an inline comment (if desired) to highlight that this is the new integration for clarity in future maintenance.

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Comment on lines 337 to +340

Define environment variables:

<Tabs id="provider" items={["OpenAI", "Anthropic", "Azure", "AWS", "Gemini", "GCP", "Groq", "Fireworks", "Cohere", "Ollama", "Chrome AI"]}>

```sh title="/.env.local" tab="OpenAI"
<Tabs id="provider" items={["OpenAI", "Anthropic", "Azure", "AWS", "Gemini", "GCP", "Groq", "Fireworks", "Cohere", "Ollama", "Chrome AI", "AI/ML API"]}>
Contributor


Fixed a syntax error in the Tabs component where the AI/ML API item was missing quotes, which would cause a parsing error.

Author


fixed this ☝️


trag-bot bot commented Mar 31, 2025

Pull request summary

  • Added support for a new AI/ML API provider in the documentation and codebase.
  • Updated the README to include the new AI/ML API in the list of supported model providers.
  • Introduced a new SVG asset for the AI/ML API provider.
  • Modified the ModelPicker component to include the new AI/ML API option.
  • Enhanced the installation instructions to include the new AI/ML API SDK.
  • Updated the API endpoint documentation to demonstrate how to use the new AI/ML API.
  • Added environment variable configuration for the new AI/ML API key.
  • Revised multiple documentation files to reflect the inclusion of the AI/ML API in various integrations.
  • Ensured that the new provider is integrated into existing components without breaking functionality.

@OctavianTheI
Author

Hi guys!
Could you approve the workflow run here?
I implemented the requested changes a while ago, so it should be fine now.
