Roll your own chat with just a few lines of code. Use any model or provider. Built on AI Elements from Vercel and the AI SDK. Supports MCP, MCP-UI, and the MCP Registry out of the box.
Use any AI SDK-compatible transport. (Comes with helpers to easily integrate OpenRouter, letting users control their own models and token usage.)
Run the following commands when starting fresh at the top level of this repo.

```sh
bun install
```

Now, let's set up your environment variables. I chose .env over .env.local because of the way Bun reads environment variables.

```sh
cp examples/web/.env.example examples/web/.env
cp examples/server/.env.example examples/server/.env
```

Build the OpenChat component and then the example app.

```sh
bun run build
```

Let's run the example app.

```sh
bun run dev
```

The npm package is coming soon. I'm thinking of it being @faith-tools/open-chat.
Basic usage:

```tsx
import OpenChatComponent from "<package-name-tbd>";

function App() {
return (
<OpenChatComponent
openRouterModel="openai/gpt-5"
api="http://localhost:3000/api/chat"
placeholder="Ask me anything..."
className="w-full h-screen"
/>
);
}
```

Require authentication, so users connect their own OpenRouter account before chatting:

```tsx
import OpenChatComponent from "<package-name-tbd>";

function SecureChat() {
return (
<OpenChatComponent
openRouterModel="openai/gpt-5"
api="http://localhost:3000/api/chat"
requireAuth={true} // Forces user to connect OpenRouter account
placeholder="Ask OpenChat..."
className="w-full h-96"
onError={(error) => console.error("Chat error:", error)}
/>
);
}
```

Enable MCP tools (users can add servers via the UI dialog):

```tsx
import OpenChatComponent from "<package-name-tbd>";

function ChatWithTools() {
return (
<OpenChatComponent
openRouterModel="anthropic/claude-3-opus"
api="http://localhost:3000/api/chat"
tools={{
enabled: true,
mcpServers: [] // User can add servers via the UI dialog
}}
mcpRegistryUrl="https://registry.modelcontextprotocol.io"
className="w-full h-screen"
onNewMessage={(msg) => console.log("New message:", msg)}
/>
);
}
```

Advanced usage:

```tsx
import OpenChatComponent from "<package-name-tbd>";
import type { UIMessage } from "@ai-sdk/react";

function AdvancedChat() {
const initialMessages: UIMessage[] = [
{
id: "1",
role: "assistant",
parts: [{ type: "text", text: "Hello! How can I help you today?" }]
}
];
return (
<OpenChatComponent
// Model configuration
openRouterModel="openai/gpt-5"
allowedModels={[ // Restrict model selection
"openai/gpt-5",
"anthropic/claude-3-opus",
"google/gemini-pro"
]}
// API configuration
api="http://localhost:3000/api/chat"
requireAuth={true}
// MCP Tools
tools={{
enabled: true,
mcpServers: []
}}
mcpRegistryUrl="https://registry.modelcontextprotocol.io"
// Chat configuration
threadId="unique-thread-123"
systemPrompt="You are a helpful AI assistant specialized in coding."
initialMessages={initialMessages}
placeholder="Ask about coding..."
// User profile
userProfile={{
name: "Developer",
chatPreferences: JSON.stringify({
preferredLanguage: "TypeScript",
codeStyle: "functional"
}),
avatarUrl: "https://example.com/avatar.png",
}}
// UI customization
className="w-full h-screen max-w-4xl mx-auto"
height="100vh"
theme="dark"
// Callbacks
onNewMessage={(msg) => {
console.log("New message:", msg);
// Save to database, analytics, etc.
}}
onSend={(text) => {
console.log("User sent:", text);
}}
onError={(error) => {
console.error("Error:", error);
// Handle errors appropriately
}}
// Custom message rendering (optional)
renderMessage={(message, part, index) => {
// Return null to use default rendering
// Or return custom JSX for specific message types
if (part.type === 'custom-type') {
return <div key={index}>Custom rendering for {part.type}</div>;
}
return null;
}}
>
{/* Optional footer content */}
<div className="p-2 text-center text-sm text-muted-foreground">
Powered by OpenRouter
</div>
</OpenChatComponent>
);
}
```

Model selection is now fully controlled by the parent. Fetch any list of models (for example, from OpenRouter) and pass it into the component:

```tsx
import OpenChatComponent, {
useOpenRouterModelOptions,
} from "<package-name-tbd>";
function ChatWithOpenRouterModels() {
const baseServerUrl = import.meta.env.VITE_SERVER_URL;
const { data: models, isLoading, error } = useOpenRouterModelOptions(baseServerUrl);
return (
<OpenChatComponent
api={`${baseServerUrl}/api/chat`}
requireAuth
models={models}
modelsLoading={isLoading}
modelsError={error ? (error instanceof Error ? error.message : String(error)) : undefined}
/>
);
}
```

If you already have model data, simply map it into the `ChatModelOption` shape and pass it via the `models` prop.
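For instance, here is a minimal sketch of that mapping. The `value` and `label` field names on ChatModelOption are assumptions (as is the type being exported); check the package's exported type for the real shape.

```tsx
// Sketch: map existing model records into the ChatModelOption shape.
// The value/label field names are assumed, not confirmed by the package.
import OpenChatComponent, { type ChatModelOption } from "<package-name-tbd>";

const myModels = [
  { id: "openai/gpt-5", displayName: "GPT-5" },
  { id: "anthropic/claude-3-opus", displayName: "Claude 3 Opus" },
];

const models: ChatModelOption[] = myModels.map((m) => ({
  value: m.id,
  label: m.displayName,
}));

<OpenChatComponent api="http://localhost:3000/api/chat" models={models} />;
```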
Pass `useChatOptions` when you need full control over the underlying `useChat` hook: custom transports, headers, or a fixed model. When a model is supplied through these options, the dropdown hides automatically so end users cannot switch away.

```tsx
import OpenChatComponent from "<package-name-tbd>";
import { DefaultChatTransport } from "ai";
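// `token` is assumed to come from your own auth flow; it is not defined in this snippet.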
<OpenChatComponent
api="http://localhost:3000/api/chat"
useChatOptions={{
transport: new DefaultChatTransport({
api: "https://proxy.example.com/chat",
headers: { Authorization: `Bearer ${token}` },
body: { model: "my-provider/small-model" },
}),
}}
/>
```

To hide the selector explicitly, set the model in your transport or supply it in the request body. The UI detects the locked model and removes the picker.
⚠️ Security note: any headers, API keys, or bearer tokens defined in `useChatOptions` live in the browser. Use short-lived scoped tokens, rotate them frequently, or proxy requests through your own backend if you cannot trust the client environment.
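If you take the backend route, a chat endpoint that keeps the key server-side might look like the sketch below. This assumes AI SDK v5 plus the @openrouter/ai-sdk-provider package; the actual server in examples/server may be structured differently.

```ts
// Sketch of a server-side /api/chat handler that keeps the OpenRouter key
// out of the browser. AI SDK v5 + @openrouter/ai-sdk-provider assumed.
import { convertToModelMessages, streamText } from "ai";
import { createOpenRouter } from "@openrouter/ai-sdk-provider";

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY, // server-held secret
});

export async function POST(req: Request) {
  const { messages, model } = await req.json();
  const result = streamText({
    model: openrouter(model ?? "openai/gpt-5"),
    messages: convertToModelMessages(messages),
  });
  // Stream back in the UI-message format the chat component consumes.
  return result.toUIMessageStreamResponse();
}
```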
OpenChatComponent no longer runs the OpenRouter OAuth dialog. Do the OAuth handshake yourself (hit /api/oauth/start, finish the redirect, stash the bearer wherever you trust it) and feed that token into `useChatOptions.prepareSendMessagesRequest`:

```tsx
import { useEffect, useMemo, useState } from "react";
import OpenChatComponent from "<package-name-tbd>";
const baseServerUrl = import.meta.env.VITE_SERVER_URL;
const lockedModel = import.meta.env.VITE_LOCKED_MODEL?.trim();
async function exchangeToken() {
const { authUrl } = await fetch(`${baseServerUrl}/api/oauth/start`, {
method: "POST",
credentials: "include",
}).then((res) => res.json());
if (authUrl) {
window.location.href = authUrl; // external redirect flow
return null;
}
const { token } = await fetch(`${baseServerUrl}/api/oauth/token`, {
credentials: "include",
}).then((res) => res.json());
return token;
}
export function ChatShell() {
const [token, setToken] = useState<string | null>(null);
useEffect(() => {
exchangeToken().then(setToken).catch(console.error);
}, []);
const chatOptions = useMemo(() => {
if (!token) return undefined;
return {
prepareSendMessagesRequest: ({ body }: { body?: Record<string, unknown> }) => ({
headers: { Authorization: `Bearer ${token}` },
body: {
...(body ?? {}),
model: lockedModel ?? body?.model ?? "openai/gpt-5",
},
}),
};
}, [token]);
return (
<OpenChatComponent
api={`${baseServerUrl}/api/chat`}
openRouterModel={lockedModel ?? "openai/gpt-5"}
useChatOptions={chatOptions}
/>
);
}
```

Use an env var like `VITE_LOCKED_MODEL` to hard-clamp the model (same trick as the example app). The published demo in examples/web/src/routes/index.tsx shows wrapping the component so you can bake in defaults; copy that pattern if you want your own opinionated shell.
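A wrapper with baked-in defaults can be as small as the sketch below. The `MyChat` name and the defaults chosen here are illustrative, not part of the package.

```tsx
// Hypothetical opinionated shell: bake in your own defaults and let
// callers override anything via props.
import type { ComponentProps } from "react";
import OpenChatComponent from "<package-name-tbd>";

type ChatProps = Partial<ComponentProps<typeof OpenChatComponent>>;

export function MyChat(overrides: ChatProps) {
  return (
    <OpenChatComponent
      api={`${import.meta.env.VITE_SERVER_URL}/api/chat`}
      openRouterModel="openai/gpt-5"
      requireAuth
      {...overrides}
    />
  );
}
```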
See the full TypeScript interface in apps/web/src/types/open-chat-component.ts for detailed prop documentation.
Key props include:
- `openRouterModel` - Initial AI model to use
- `allowedModels` - Restrict available models for selection
- `api` - Backend API endpoint for chat
- `requireAuth` - Force authentication before chatting
- `tools` - MCP server configuration
- `systemPrompt` - System instructions for the AI
- `threadId` - Unique thread identifier
- `onNewMessage` - Callback for new messages
- `renderMessage` - Custom message rendering
- `theme` - Light/dark theme support
- OpenRouter support
- Get list of OpenRouter models based on user provider preferences
- MCP client support
- MCP Tools
- MCP Prompts
- MCP Resources
- Host this app
- Add info about Open Inference into the settings dialog
- Allow theme CSS variables for the component
- One-click MCP install
- Add your own MCP server
- MCP UI support
- Local LLM AI support (Transformers.js + AI SDK + built-in-ai)
- Personality Profile (tailor LLM to your preferences)
- Web component chatbot export
- Make it onto the official MCP Clients list
Here are some top-tier MCP registries I've been keeping an eye on:
This project was created with Better-T-Stack.
For contributors: see AGENTS.md