EdgePilot AI

Backend API handlers for Cloudflare Workers AI integration in Next.js and Edge Runtime applications.

Looking for React components? Check out edgepilot-ui for pre-built chat interfaces and components.

🎯 Examples | 📚 Docs | 🌐 edgepilot.dev (coming soon)

What is EdgePilot?

EdgePilot provides backend API handlers that connect your Next.js app to Cloudflare Workers AI, enabling:

  • Zero cold-start AI inference at the edge
  • Lower inference costs with Cloudflare's edge pricing model
  • Global edge deployment on Cloudflare's network

Why Backend-Only?

We believe in separation of concerns:

  • Use ANY UI framework (React, Vue, Svelte)
  • Bring your own component library (MUI, Ant, Tailwind)
  • No forced UI opinions
  • Smaller bundle size

Want pre-built components? Check out edgepilot-ui!

Features

  • 🚀 Zero Cold Start: Leverage Cloudflare's global edge network for instant AI responses
  • ⚡ Next.js Integration: Drop-in API route handlers for Next.js 13.4+ App Router
  • 📡 Streaming Support: Real-time streaming responses with Server-Sent Events
  • 🤖 Multiple Models: Support for Llama 3.1, Mistral, Qwen, and 50+ other models
  • 📦 Type-Safe: Full TypeScript support with type definitions
  • 🌍 Edge Runtime: Optimized for Vercel Edge, Cloudflare Workers, and other edge platforms
  • 🔄 Automatic Retries: Built-in retry logic with exponential backoff
  • 💾 Response Caching: Optional caching for repeated queries

Installation

npm install edgepilot-ai
# or
pnpm add edgepilot-ai
# or
yarn add edgepilot-ai

Quick Start

This package provides the backend API handlers. For a complete example with UI components, see our starter example.

1. Set up environment variables

First, get your Cloudflare credentials:

  1. Go to Cloudflare Dashboard → My Profile → API Tokens
  2. Click Create Token → Use Custom token template
  3. Add permission: Account → Cloudflare Workers AI → Edit
  4. Copy your token and account ID

Create a .env.local file:

CLOUDFLARE_API_TOKEN=your-api-token
CLOUDFLARE_ACCOUNT_ID=your-account-id
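
If a credential is missing, the failure may only surface on the first API call. A small guard like the following fails fast at startup instead (a sketch; `assertEnv` is a hypothetical helper, not part of edgepilot-ai):

```typescript
// Hypothetical helper: verify the Cloudflare credentials are set
// before wiring up any handlers. Not part of edgepilot-ai.
export function assertEnv(env: Record<string, string | undefined>): void {
  const required = ['CLOUDFLARE_API_TOKEN', 'CLOUDFLARE_ACCOUNT_ID'];
  const missing = required.filter((key) => !env[key]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
}

// Example: assertEnv(process.env);
```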

2. Create an API route

Create app/api/ai/chat/route.ts:

import { createNextHandler } from 'edgepilot-ai/next';

export const runtime = 'edge';

const handler = createNextHandler({
  model: '@cf/meta/llama-3.1-8b-instruct',
  stream: true,
});

export const POST = handler;

3. Use in your components

import { useChat } from 'your-favorite-chat-library';

export function ChatComponent() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/ai/chat',
  });

  return (
    <form onSubmit={handleSubmit}>
      <input value={input} onChange={handleInputChange} />
      <button type="submit">Send</button>
      {messages.map(m => (
        <div key={m.id}>{m.content}</div>
      ))}
    </form>
  );
}
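
If you prefer not to pull in a chat library, the streaming endpoint can also be consumed with plain `fetch`. This sketch assumes the handler emits Server-Sent Events as `data:` lines with a `[DONE]` terminator, which is a common convention but not confirmed by this README; adjust the parsing to the actual wire format:

```typescript
// Extract payloads from an SSE chunk, skipping the assumed "[DONE]" sentinel.
export function extractSSEData(chunk: string): string[] {
  return chunk
    .split('\n')
    .filter((line) => line.startsWith('data: '))
    .map((line) => line.slice('data: '.length))
    .filter((data) => data !== '[DONE]');
}

// Stream a chat completion from the route defined above and
// accumulate the text as it arrives.
export async function streamChat(
  messages: { role: string; content: string }[]
): Promise<string> {
  const res = await fetch('/api/ai/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let text = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const data of extractSSEData(decoder.decode(value, { stream: true }))) {
      text += data;
    }
  }
  return text;
}
```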

Available Models

EdgePilot supports all Cloudflare Workers AI models:

Fast Response (Under 100ms)

  • @cf/meta/llama-3.1-8b-instruct - Best for real-time chat
  • @cf/mistral/mistral-7b-instruct-v0.2 - Great for summaries
  • @cf/qwen/qwen1.5-7b-chat-awq - Multilingual support

High Quality (100-500ms)

  • @cf/meta/llama-3.3-70b-instruct - State-of-the-art responses
  • @cf/meta/llama-3.1-70b-instruct - Best for complex reasoning

Specialized

  • @cf/meta/llama-3-8b-instruct-awq - Ultra-fast with AWQ quantization
  • @cf/deepseek-ai/deepseek-math-7b-instruct - Mathematical reasoning

See all available models in Cloudflare's documentation.
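
If your app switches models by use case, a small lookup keyed to the tiers above keeps model IDs in one place (a sketch; the tier names and the `pickModel` helper are illustrative, not part of edgepilot-ai):

```typescript
// Hypothetical helper: map a latency/quality tier to one of the
// model IDs listed above. Not part of edgepilot-ai.
type Tier = 'fast' | 'quality' | 'math';

const MODELS: Record<Tier, string> = {
  fast: '@cf/meta/llama-3.1-8b-instruct',           // real-time chat
  quality: '@cf/meta/llama-3.3-70b-instruct',       // complex reasoning
  math: '@cf/deepseek-ai/deepseek-math-7b-instruct' // mathematical tasks
};

export function pickModel(tier: Tier): string {
  return MODELS[tier];
}
```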

API Reference

createNextHandler(options)

Creates a Next.js API route handler for AI chat endpoints.

import { createNextHandler } from 'edgepilot-ai/next';
import type { Config } from 'edgepilot-ai';

const handler = createNextHandler({
  model: '@cf/meta/llama-3.1-8b-instruct', // AI model to use
  stream: true,                             // Enable streaming
  cache: false,                             // Response caching
  debug: false,                             // Debug logging
});

TypeScript Usage

EdgePilot is fully typed. Import types for better IDE support:

import type {
  Config,
  Message,
  HttpError,
  StreamingOptions
} from 'edgepilot-ai';

// Type-safe configuration
const config: Config = {
  model: '@cf/meta/llama-3.1-8b-instruct',
  stream: true,
  cache: false,
  debug: process.env.NODE_ENV === 'development'
};

// Type-safe message handling
const messages: Message[] = [
  { role: 'system', content: 'You are a helpful assistant' },
  { role: 'user', content: 'Hello!' }
];

Options

  • model (string): The Cloudflare AI model to use
  • stream (boolean): Enable streaming responses (default: true)
  • cache (boolean): Enable response caching (default: false)
  • debug (boolean): Enable debug logging (default: false)
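
When building configuration objects outside the handler, the documented defaults can be applied explicitly. This is a sketch: `withDefaults` is a hypothetical helper, and the local `HandlerOptions` interface stands in for the package's `Config` type:

```typescript
// Local stand-in for the package's Config type.
interface HandlerOptions {
  model: string;
  stream?: boolean;
  cache?: boolean;
  debug?: boolean;
}

// Apply the documented defaults: stream true, cache false, debug false.
export function withDefaults(options: HandlerOptions): Required<HandlerOptions> {
  return {
    model: options.model,
    stream: options.stream ?? true,
    cache: options.cache ?? false,
    debug: options.debug ?? false,
  };
}
```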

Examples

Check out the examples/starter directory for a complete working example with:

  • Chat interface with streaming
  • Model selection
  • AI-enhanced textarea
  • Theme support
  • Mobile responsive design

To run the example:

cd examples/starter
pnpm install
pnpm dev

Package Ecosystem

EdgePilot is part of a modular ecosystem:

Package            Description                                                  Status
edgepilot-ai       Backend API handlers for Cloudflare Workers AI               ✅ Available
edgepilot-ui       Pre-built React components (ChatPopup, ModelSelector, etc.)  🚧 Coming Soon
create-edgepilot   CLI to scaffold new projects                                 📋 Planned

This separation allows you to:

  • Use just the backend with your own UI
  • Use pre-built components with edgepilot-ui
  • Mix and match based on your needs

Development

# Install dependencies
pnpm install

# Build the package
pnpm build

# Run in development mode
pnpm dev

# Run the example
pnpm example:dev

License

MIT

Author

Audrey Klammer

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Disclaimer

EdgePilot is an independent project and is not affiliated with, endorsed by, or associated with Open Text Corporation, Cloudflare, Inc., or any other company mentioned in this documentation. All trademarks are the property of their respective owners.
