Backend API handlers for Cloudflare Workers AI integration in Next.js and Edge Runtime applications.
Looking for React components? Check out edgepilot-ui for pre-built chat interfaces.
Examples | Docs | edgepilot.dev (coming soon)
EdgePilot provides backend API handlers that connect your Next.js app to Cloudflare Workers AI, enabling:
- Zero cold-start AI inference at the edge
- Lower inference costs through Cloudflare's edge pricing model
- Global edge deployment on Cloudflare's network
We believe in separation of concerns:
- Use ANY UI framework (React, Vue, Svelte)
- Bring your own component library or styling (MUI, Ant Design, Tailwind)
- No forced UI opinions
- Smaller bundle size
Want pre-built components? Check out edgepilot-ui!
- Zero Cold Start: Leverage Cloudflare's global edge network for instant AI responses
- Next.js Integration: Drop-in API route handlers for Next.js 13.4+ App Router
- Streaming Support: Real-time streaming responses with Server-Sent Events
- Multiple Models: Support for Llama 3.1, Mistral, Qwen, and 50+ other models
- Type-Safe: Full TypeScript support with type definitions
- Edge Runtime: Optimized for Vercel Edge, Cloudflare Workers, and other edge platforms
- Automatic Retries: Built-in retry logic with exponential backoff
- Response Caching: Optional caching for repeated queries
```bash
npm install edgepilot-ai
# or
pnpm add edgepilot-ai
# or
yarn add edgepilot-ai
```

This package provides the backend API handlers. For a complete example with UI components, see our starter example.
First, get your Cloudflare credentials:
- Go to Cloudflare Dashboard → My Profile → API Tokens
- Click Create Token → Use Custom token template
- Add permission: Account → Cloudflare Workers AI → Edit
- Copy your token and account ID
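If you want to confirm the credentials work before wiring them into the app, you can call Cloudflare's Workers AI REST API directly. This is an optional check outside of EdgePilot; the endpoint path and request body below follow Cloudflare's documented REST API, so double-check them against Cloudflare's docs if anything looks off.

```typescript
// check-credentials.mts -- hypothetical helper script, not part of EdgePilot.
// Calls Cloudflare's Workers AI REST API directly to confirm that the token
// and account ID are valid. Requires Node 18+ for the built-in fetch.
const accountId = process.env.CLOUDFLARE_ACCOUNT_ID;
const apiToken = process.env.CLOUDFLARE_API_TOKEN;
const model = '@cf/meta/llama-3.1-8b-instruct';

const res = await fetch(
  `https://api.cloudflare.com/client/v4/accounts/${accountId}/ai/run/${model}`,
  {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      messages: [{ role: 'user', content: 'Say hello in five words.' }],
    }),
  }
);

// A 200 status means the token and account ID are working.
console.log(res.status, await res.json());
```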
Create a `.env.local` file:

```bash
CLOUDFLARE_API_TOKEN=your-api-token
CLOUDFLARE_ACCOUNT_ID=your-account-id
```

Create `app/api/ai/chat/route.ts`:

```typescript
import { createNextHandler } from 'edgepilot-ai/next';
export const runtime = 'edge';
const handler = createNextHandler({
model: '@cf/meta/llama-3.1-8b-instruct',
stream: true,
});
export const POST = handler;
```

Then call the endpoint from your UI, for example with a chat hook:

```tsx
import { useChat } from 'your-favorite-chat-library';
export function ChatComponent() {
const { messages, input, handleInputChange, handleSubmit } = useChat({
api: '/api/ai/chat',
});
return (
<form onSubmit={handleSubmit}>
<input value={input} onChange={handleInputChange} />
<button type="submit">Send</button>
{messages.map(m => (
<div key={m.id}>{m.content}</div>
))}
</form>
);
}
```
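You don't need a chat library at all: the route is a plain HTTP endpoint, so any framework (or vanilla JavaScript) can consume it with fetch. The sketch below assumes the handler accepts a `{ messages }` JSON body and streams Server-Sent Events, in line with the streaming feature described above; verify both against your handler's actual contract before relying on it.

```typescript
// Framework-agnostic sketch of consuming the streaming endpoint with plain fetch.
// Assumptions (not spelled out in this README): the route accepts a { messages }
// JSON body and streams Server-Sent Events as "data: ..." lines.
export async function streamChat(
  prompt: string,
  onToken: (text: string) => void
): Promise<void> {
  const res = await fetch('/api/ai/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages: [{ role: 'user', content: prompt }] }),
  });

  if (!res.ok || !res.body) {
    throw new Error(`Chat request failed with status ${res.status}`);
  }

  const reader = res.body.getReader();
  const decoder = new TextDecoder();

  // Read the response stream chunk by chunk and forward each SSE data payload.
  // (For brevity, lines split across chunk boundaries are not re-buffered.)
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const line of decoder.decode(value, { stream: true }).split('\n')) {
      if (line.startsWith('data: ') && line !== 'data: [DONE]') {
        onToken(line.slice('data: '.length));
      }
    }
  }
}
```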
EdgePilot supports all Cloudflare Workers AI models:

- `@cf/meta/llama-3.1-8b-instruct` - Best for real-time chat
- `@cf/mistral/mistral-7b-instruct-v0.2` - Great for summaries
- `@cf/qwen/qwen1.5-7b-chat-awq` - Multilingual support
- `@cf/meta/llama-3.3-70b-instruct` - State-of-the-art responses
- `@cf/meta/llama-3.1-70b-instruct` - Best for complex reasoning
- `@cf/meta/llama-3-8b-instruct-awq` - Ultra-fast with AWQ quantization
- `@cf/deepseek-ai/deepseek-math-7b-instruct` - Mathematical reasoning
See all available models in Cloudflare's documentation.
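Because the model is fixed per handler, one simple pattern is to expose several routes, each pinned to a different model. The file path below is illustrative, and only the createNextHandler options documented in this README are used; `stream: false` is an assumption for a one-shot summary response.

```typescript
// app/api/ai/summarize/route.ts (illustrative path, not required by EdgePilot)
import { createNextHandler } from 'edgepilot-ai/next';

export const runtime = 'edge';

// A second route pinned to a summarization-friendly model from the list above.
export const POST = createNextHandler({
  model: '@cf/mistral/mistral-7b-instruct-v0.2',
  stream: false, // assumption: return the whole summary in one response
});
```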
`createNextHandler(config)` creates a Next.js API route handler for AI chat endpoints.
```typescript
import { createNextHandler } from 'edgepilot-ai/next';
import type { Config } from 'edgepilot-ai';
const handler = createNextHandler({
model: '@cf/meta/llama-3.1-8b-instruct', // AI model to use
stream: true, // Enable streaming
cache: false, // Response caching
debug: false, // Debug logging
});
```

EdgePilot is fully typed. Import types for better IDE support:

```typescript
import type {
Config,
Message,
HttpError,
StreamingOptions
} from 'edgepilot-ai';
// Type-safe configuration
const config: Config = {
model: '@cf/meta/llama-3.1-8b-instruct',
stream: true,
cache: false,
debug: process.env.NODE_ENV === 'development'
};
// Type-safe message handling
const messages: Message[] = [
{ role: 'system', content: 'You are a helpful assistant' },
{ role: 'user', content: 'Hello!' }
];
```

Configuration options:

- `model` (string): The Cloudflare AI model to use
- `stream` (boolean): Enable streaming responses (default: true)
- `cache` (boolean): Enable response caching (default: false)
- `debug` (boolean): Enable debug logging (default: false)
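As a quick way to exercise a configured route, the sketch below builds a typed payload with Message and posts it to the endpoint from a standalone script. It assumes the handler accepts a `{ messages: Message[] }` body, which this README does not spell out, so treat it as a starting point rather than a guaranteed contract.

```typescript
// smoke-test.mts -- hypothetical script for manually exercising the route.
// Assumes the handler accepts a JSON body of { messages: Message[] }.
import type { Message } from 'edgepilot-ai';

const messages: Message[] = [
  { role: 'system', content: 'You are a helpful assistant' },
  { role: 'user', content: 'Reply with the single word: pong' },
];

const res = await fetch('http://localhost:3000/api/ai/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ messages }),
});

// With stream: true the body arrives as an event stream; with stream: false
// it is a single payload. Logging status and content type confirms the route
// is wired up without assuming a specific response shape.
console.log(res.status, res.headers.get('content-type'));
```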
Check out the `examples/starter` directory for a complete working example with:
- Chat interface with streaming
- Model selection
- AI-enhanced textarea
- Theme support
- Mobile responsive design
To run the example:

```bash
cd examples/starter
pnpm install
pnpm dev
```

EdgePilot is part of a modular ecosystem:
| Package | Description | Status |
|---|---|---|
| edgepilot-ai | Backend API handlers for Cloudflare Workers AI | Available |
| edgepilot-ui | Pre-built React components (ChatPopup, ModelSelector, etc.) | Coming Soon |
| create-edgepilot | CLI to scaffold new projects | Planned |
This separation allows you to:
- Use just the backend with your own UI
- Use pre-built components with edgepilot-ui
- Mix and match based on your needs
```bash
# Install dependencies
pnpm install

# Build the package
pnpm build

# Run in development mode
pnpm dev

# Run the example
pnpm example:dev
```

Licensed under the MIT License.
Author: Audrey Klammer
Contributions are welcome! Please feel free to submit a Pull Request.
EdgePilot is an independent project and is not affiliated with, endorsed by, or associated with Open Text Corporation, Cloudflare, Inc., or any other company mentioned in this documentation. All trademarks are the property of their respective owners.