
ASK: Agent Swiss Knife

PydanticAI-powered CLI agent with Model Context Protocol (MCP) server support

ASK is a versatile AI agent that works both as a CLI tool with MCP server integrations and as an MCP server itself, extending other LLM clients such as Claude Desktop and VS Code Copilot.

Features

  • Multi-provider LLM support: OpenAI, Ollama, OpenRouter, LM Studio, Google, Anthropic, or any OpenAI-compatible model API
  • MCP server integration: stdio, SSE, HTTP transports
  • Dual mode operation: CLI agent + MCP server
  • Rich tool ecosystem: Web search, file ops, memory, YouTube transcripts, any MCP tool
  • Environment variable support: Secure API key management
  • YAML configuration: Simple, readable config format

Quick Usage with uvx

Run ASK directly, without installing it, using a minimal config:

# Create minimal config
echo "agent:
  instructions: 'You are a helpful AI assistant.'
llm:
  model: 'openai:gpt-5-nano'
  api_key: '<your-api-key>'" > .ask.yaml

# Run with uvx
uvx --from git+https://github.com/varlabz/ask ask-cli -c .ask.yaml "Explain machine learning"

Configuration

Example of advanced configuration

Create an agent.yaml file:

agent:
  instructions: "You are a helpful AI assistant with access to web search and file operations."

llm:      # for OpenAI-compatible models
  model: openai:deepseek-chat
  api_key: file:~/.config/ask/deepseek
  base_url: https://api.deepseek.com/v1/

mcp:
  fetch:
    command: ["uvx", "mcp-server-fetch", "--ignore-robots-txt"]
  
  search:
    command: ["uvx", "--from", "git+https://github.com/varlabz/searxng-mcp", "searxng-mcp"]

Complete Configuration Example

agent:
  instructions: |
    You are a helpful AI assistant with access to various tools and services.
    Provide accurate, helpful, and concise responses. When using tools, explain
    what you're doing and why. Be proactive in suggesting useful tools when appropriate.

llm:
  model: "openai:gpt-4o"
  api_key: "env:OPENAI_API_KEY"
  # Alternative providers:
  # model: "openrouter:anthropic/claude-3.5-sonnet"
  # api_key: "env:OPENROUTER_API_KEY"         # read the key from the OPENROUTER_API_KEY environment variable
  # or
  # api_key: "file:path to openrouter key file"  # read the key from a file
  # or
  # api_key: "api key as is"                   # use the literal value as the key

mcp:
  filesystem:
    command: ["npx", "-y", "@modelcontextprotocol/server-filesystem", "."]
  
  memory:
    command: ["npx", "-y", "@modelcontextprotocol/server-memory"]
  
  fetch:
    command: ["uvx", "mcp-server-fetch", "--ignore-robots-txt"]
  
  youtube:
    command: ["npx", "-y", "https://github.com/varlabz/youtube-mcp", "--mcp"]
    
  sequential_thinking:
    command: ["npx", "-y", "@modelcontextprotocol/server-sequential-thinking"]
  
  searxng:
    command: ["uvx", "--from", "git+https://github.com/varlabz/searxng-mcp", "searxng-mcp"]
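The three api_key forms above (env:, file:, and a literal value) follow a simple prefix convention. The resolver below is a hypothetical sketch of that convention for illustration only, not ASK's actual implementation:

```python
import os
from pathlib import Path

def resolve_api_key(value: str) -> str:
    """Resolve an api_key value using the env:/file: prefix convention."""
    if value.startswith("env:"):
        # "env:OPENAI_API_KEY" -> read the key from that environment variable
        return os.environ[value[len("env:"):]]
    if value.startswith("file:"):
        # "file:~/.config/ask/deepseek" -> read the key from a file (~ expands)
        return Path(value[len("file:"):]).expanduser().read_text().strip()
    # No prefix: the value itself is the key.
    return value

os.environ["EXAMPLE_KEY"] = "sk-demo"
print(resolve_api_key("env:EXAMPLE_KEY"))  # sk-demo
print(resolve_api_key("sk-literal"))       # sk-literal
```

The env: and file: forms keep secrets out of the YAML file itself, which is preferable whenever the config is committed to version control.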

Use as MCP Server

ASK can extend other LLMs by running as an MCP server, providing access to your configured AI agent.

Claude Desktop Configuration (example)

Add to your Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json):

{
  "mcpServers": {
    "ask": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/varlabz/ask",
        "ask-mcp",
        "-c",
        "/path/to/your/agent.yaml"
      ]
    }
  }
}
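Before restarting Claude Desktop, the snippet can be sanity-checked with a short stdlib-only script (the config path here is the same placeholder as above):

```python
import json

raw = """
{
  "mcpServers": {
    "ask": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/varlabz/ask",
        "ask-mcp",
        "-c",
        "/path/to/your/agent.yaml"
      ]
    }
  }
}
"""

config = json.loads(raw)  # raises json.JSONDecodeError on malformed JSON
server = config["mcpServers"]["ask"]
# Each server entry needs a command string and a list of string args.
assert isinstance(server["command"], str)
assert all(isinstance(arg, str) for arg in server["args"])
print("config OK:", server["command"], *server["args"])
```

A stray trailing comma or unescaped backslash in the path is the most common reason Claude Desktop silently ignores a server entry, so a quick parse check saves a restart cycle.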

VS Code MCP Extension Configuration

Add to VS Code settings (mcp.json):

{
  "servers": {
    "ask": {
      "command": "uvx",
      "args": [
        "--from", 
        "git+https://github.com/varlabz/ask",
        "ask-mcp",
        "-c",
        "${workspaceFolder}/agent.yaml"
      ]
    }
  }
}

Development Setup

Clone and Setup

git clone https://github.com/varlabz/ask.git
cd ask
uv sync --extra dev

Run Tests

uv run pytest

Code Quality

This project uses Ruff for linting and formatting.

# Check for linting issues
uv run ruff check .

# Auto-fix linting issues
uv run ruff check --fix .

# Format code
uv run ruff format .

# Check formatting without changes
uv run ruff format --check .

Local Installation

pip install -e .