4 changes: 4 additions & 0 deletions README.md
Original file line number Diff line number Diff line change
@@ -137,6 +137,10 @@ npx @mcpjam/inspector@latest

In the UI's "MCP Servers" tab, click "Add Server", select HTTP, then paste in your server URL. OAuth 2.0 testing is supported.

## Amazon Bedrock Support

Connect MCPJam Inspector to Amazon Bedrock by adding your credentials in **Settings → LLM Provider API Keys**. Enter the values as `accessKeyId|secretAccessKey[|sessionToken]`; they are stored locally in your browser. Set the optional `AWS_BEDROCK_REGION` environment variable on the server to override the default `us-west-2` region. Once configured, you can chat with Amazon Titan Text Lite, Claude 3.5 Sonnet (Bedrock), and Llama 3.3 70B Instruct.
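The pipe-delimited credential string described above can be split with a small helper. A minimal sketch, assuming the format stays `accessKeyId|secretAccessKey[|sessionToken]`; `parseBedrockCredentials` is a hypothetical name for illustration, not a function in this PR:

```typescript
// Hypothetical helper illustrating the expected credential format;
// the actual MCPJam code may parse this differently.
interface BedrockCredentials {
  accessKeyId: string;
  secretAccessKey: string;
  sessionToken?: string;
}

function parseBedrockCredentials(raw: string): BedrockCredentials {
  // Split on "|" and trim whitespace around each part.
  const [accessKeyId, secretAccessKey, sessionToken] = raw
    .split("|")
    .map((part) => part.trim());
  if (!accessKeyId || !secretAccessKey) {
    throw new Error(
      "Expected credentials as accessKeyId|secretAccessKey[|sessionToken]",
    );
  }
  // The session token is optional; omit it when absent or empty.
  return sessionToken
    ? { accessKeyId, secretAccessKey, sessionToken }
    : { accessKeyId, secretAccessKey };
}
```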

## Requirements

[![Node.js](https://img.shields.io/badge/Node.js-20+-green.svg?style=for-the-badge&logo=node.js)](https://nodejs.org/)
13 changes: 13 additions & 0 deletions client/public/bedrock_logo.svg
11 changes: 11 additions & 0 deletions client/src/components/SettingsTab.tsx
@@ -87,6 +87,17 @@ export function SettingsTab() {
placeholder: "...",
getApiKeyUrl: "https://console.mistral.ai/api-keys/",
},
{
id: "bedrock",
name: "Amazon Bedrock",
logo: "/bedrock_logo.svg",
logoAlt: "Amazon Bedrock",
description:
"Amazon Titan, Claude 3.5 Sonnet (Bedrock), Llama 3.3 70B via AWS Bedrock.",
placeholder: "AKIA...|wJalrX...",
getApiKeyUrl:
"https://console.aws.amazon.com/iam/home#/security_credentials",
},
];

const handleEdit = (providerId: string) => {
5 changes: 5 additions & 0 deletions client/src/components/chat/chat-helpers.ts
@@ -13,6 +13,7 @@ import litellmLogo from "/litellm_logo.png";
import moonshotLightLogo from "/moonshot_light.png";
import moonshotDarkLogo from "/moonshot_dark.png";
import zAiLogo from "/z-ai.png";
import bedrockLogo from "/bedrock_logo.svg";

export const getProviderLogoFromProvider = (
provider: string,
@@ -29,6 +30,8 @@ export const getProviderLogoFromProvider = (
return googleLogo;
case "mistral":
return mistralLogo;
case "bedrock":
return bedrockLogo;
case "ollama":
// Return dark logo when in dark mode
if (themeMode === "dark") {
@@ -89,6 +92,8 @@ export const getProviderColor = (provider: string) => {
return "text-red-600 dark:text-red-400";
case "mistral":
return "text-orange-500 dark:text-orange-400";
case "bedrock":
return "text-amber-600 dark:text-amber-400";
case "ollama":
return "text-gray-600 dark:text-gray-400";
case "x-ai":
1 change: 1 addition & 0 deletions client/src/components/chat/mcpjam-model-selector.tsx
@@ -39,6 +39,7 @@ const PROVIDER_DISPLAY_NAME: Partial<Record<string, string>> = {
moonshotai: "Moonshot AI",
"z-ai": "Zhipu AI",
mistral: "Mistral AI",
bedrock: "Amazon Bedrock",
};

function toDisplayName(
2 changes: 2 additions & 0 deletions client/src/components/chat/model-selector.tsx
@@ -54,6 +54,8 @@ const getProviderDisplayName = (provider: ModelProvider): string => {
return "Google AI";
case "mistral":
return "Mistral AI";
case "bedrock":
return "Amazon Bedrock";
case "ollama":
return "Ollama";
case "meta":
21 changes: 21 additions & 0 deletions client/src/components/setting/ProviderConfigDialog.tsx
@@ -119,6 +119,27 @@ export function ProviderConfigDialog({
</AlertDescription>
</Alert>
)}
{provider?.id === "bedrock" && (
<Alert>
<AlertDescription>
<p>
Enter your credentials as
<code className="mx-1 rounded bg-muted px-1 py-[1px] text-xs">
accessKeyId|secretAccessKey[|sessionToken]
</code>
. We store them locally in your browser. Optionally set the
<code className="mx-1 rounded bg-muted px-1 py-[1px] text-xs">
AWS_BEDROCK_REGION
</code>
environment variable on the server (defaults to
<code className="mx-1 rounded bg-muted px-1 py-[1px] text-xs">
us-west-2
</code>
).
</p>
</AlertDescription>
</Alert>
)}
</div>

<DialogFooter>
7 changes: 6 additions & 1 deletion client/src/hooks/use-ai-provider-keys.ts
@@ -6,6 +6,7 @@ export interface ProviderTokens {
deepseek: string;
google: string;
mistral: string;
bedrock: string;
ollama: string;
ollamaBaseUrl: string;
litellm: string;
@@ -36,6 +37,7 @@ const defaultTokens: ProviderTokens = {
deepseek: "",
google: "",
mistral: "",
bedrock: "",
ollama: "local", // Ollama runs locally, no API key needed
ollamaBaseUrl: "http://localhost:11434/api",
litellm: "", // LiteLLM API key (optional, depends on proxy setup)
@@ -54,7 +56,10 @@ export function useAiProviderKeys(): useAiProviderKeysReturn {
const stored = localStorage.getItem(STORAGE_KEY);
if (stored) {
const parsedTokens = JSON.parse(stored) as ProviderTokens;
setTokens(parsedTokens);
setTokens({
...defaultTokens,
...parsedTokens,
});
}
} catch (error) {
console.warn(
1 change: 1 addition & 0 deletions client/src/hooks/use-chat.ts
@@ -170,6 +170,7 @@ export function useChat(options: UseChatOptions = {}) {
deepseek: hasToken("deepseek"),
google: hasToken("google"),
mistral: hasToken("mistral"),
bedrock: hasToken("bedrock"),
ollama: isOllamaRunning,
litellm: Boolean(getLiteLLMBaseUrl() && getLiteLLMModelAlias()),
meta: false,
2 changes: 2 additions & 0 deletions client/src/lib/chat-utils.ts
@@ -167,6 +167,8 @@ export function getDefaultTemperatureByProvider(provider: string): number {
return 0.9; // Google's recommended default
case "mistral":
return 0.7; // Mistral's recommended default
case "bedrock":
return 0;
default:
return 0;
}
6 changes: 4 additions & 2 deletions docs/contributing/onboarding.mdx
@@ -121,6 +121,7 @@ MCPJam supports multiple LLM providers:
- OpenAI (GPT-3.5, GPT-4)
- Anthropic (Claude 2, Claude 3)
- DeepSeek (DeepSeek R1)
- Amazon Bedrock (Titan, Claude, Llama via AWS)
- Ollama (Local models)

**Key file:** `client/src/lib/llm-providers.ts`
@@ -141,8 +142,9 @@ MCPJam supports multiple LLM providers:

<Step title="Set Environment Variables">
Create a `.env` file in the root directory with your API keys:

```bash
OPENAI_API_KEY=your_key_here
ANTHROPIC_API_KEY=your_key_here
BEDROCK_CREDENTIALS="accessKeyId|secretAccessKey"
AWS_BEDROCK_REGION=us-west-2
# Optional: add other provider keys
```
</Step>

<Step title="Start Development Server">
3 changes: 2 additions & 1 deletion docs/evals/overview.mdx
@@ -79,7 +79,8 @@ This file is configured very similar to a `mcp.json` file. You must provide at l
"providerApiKeys": {
"anthropic": "${ANTHROPIC_API_KEY}",
"openai": "${OPENAI_API_KEY}",
"deepseek": "${DEEPSEEK_API_KEY}"
"deepseek": "${DEEPSEEK_API_KEY}",
"bedrock": "${BEDROCK_CREDENTIALS}"
}
}
```
6 changes: 6 additions & 0 deletions docs/inspector/llm-playground.mdx
@@ -55,6 +55,12 @@ Get an API key from [Mistral AI Console](https://console.mistral.ai/api-keys/)

`mistral-large-latest`, `mistral-small-latest`, `codestral-latest`, `ministral-8b-latest`, `ministral-3b-latest`

### Amazon Bedrock

Create an IAM user with Bedrock permissions and generate an access key in the [AWS console](https://console.aws.amazon.com/iam/home#/security_credentials). In MCPJam, enter your credentials as `accessKeyId|secretAccessKey[|sessionToken]` in the Settings tab. You can optionally set `AWS_BEDROCK_REGION` on the server (defaults to `us-west-2`).

`amazon.titan-text-lite-v1`, `anthropic.claude-3-5-sonnet-20241022-v2:0`, `meta.llama3-3-70b-instruct-v1:0`
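The region fallback described above can be sketched as follows. This is a hypothetical helper for illustration (`resolveBedrockRegion` is not a function in this PR; the server's actual logic may differ):

```typescript
// Hypothetical sketch of the documented behavior: use AWS_BEDROCK_REGION
// when set to a non-blank value, otherwise fall back to "us-west-2".
function resolveBedrockRegion(
  env: Record<string, string | undefined>,
): string {
  const region = env["AWS_BEDROCK_REGION"]?.trim();
  return region ? region : "us-west-2";
}
```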

### Ollama

Make sure you have [Ollama installed](https://ollama.com/), and the MCPJam Ollama URL configuration is pointing to your Ollama instance.