FastAPI AI SDK

Python 3.9+ · FastAPI · License: MIT · Code style: black

A Pythonic helper library for building FastAPI applications that integrate with the Vercel AI SDK. This library provides a seamless way to stream AI responses from your FastAPI backend to your Next.js frontend.

Features

  • Full Vercel AI SDK Compatibility - Implements the complete AI SDK protocol specification
  • Type-Safe with Pydantic - Full type hints and validation for all events
  • Streaming Support - Built-in Server-Sent Events (SSE) streaming
  • Easy Integration - Simple decorators and utilities for FastAPI
  • Flexible Builder Pattern - Intuitive API for constructing AI streams
  • Well Tested - Comprehensive test coverage
  • Fully Documented - Complete documentation with examples

Installation

pip install fastapi-ai-sdk

Quick Start

Basic Example

from fastapi import FastAPI
from fastapi_ai_sdk import AIStreamBuilder, ai_endpoint

app = FastAPI()

@app.post("/api/chat")
@ai_endpoint()
async def chat(message: str):
    """Simple chat endpoint that streams a response."""
    builder = AIStreamBuilder()
    builder.text(f"You said: {message}")
    return builder

Frontend Integration (Next.js)

import { useChat } from "@ai-sdk/react";

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: "http://localhost:8000/api/chat",
  });

  return (
    <div>
      {messages.map((msg) => (
        <div key={msg.id}>{msg.content}</div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}
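Because the Next.js dev server (typically on localhost:3000) and the FastAPI backend run on different origins, the browser will block the streaming request unless CORS is enabled on the backend. A minimal setup using FastAPI's built-in middleware (the origin list is an example; restrict it to your real deployment):

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Allow the Next.js dev server to call the streaming endpoints.
# Tighten allow_origins for production.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:3000"],
    allow_methods=["*"],
    allow_headers=["*"],
)
```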

Documentation

Stream Events

The library supports all Vercel AI SDK event types:

  • Message Lifecycle: start, finish
  • Text Streaming: text-start, text-delta, text-end
  • Reasoning: reasoning-start, reasoning-delta, reasoning-end
  • Tool Calls: tool-input-start, tool-input-delta, tool-input-available, tool-output-available
  • Structured Data: Custom data-* events
  • File References: URLs and documents
  • Error Handling: Error events with messages
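On the wire, each of these events travels as a JSON object on a Server-Sent Events `data:` line. As an illustration only (the exact fields the library emits may differ), a `text-delta` event could be framed like this:

```python
import json

def to_sse(event: dict) -> str:
    """Frame a single event dict as one SSE message:
    a 'data:' line followed by a blank line."""
    return f"data: {json.dumps(event)}\n\n"

frame = to_sse({"type": "text-delta", "id": "txt_1", "delta": "Hello"})
# frame is one complete SSE message the browser's EventSource/fetch reader can parse
```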

Using the Stream Builder

from fastapi_ai_sdk import AIStreamBuilder

# Create a builder
builder = AIStreamBuilder(message_id="optional_id")

# Add different types of content
builder.start()  # Start the stream
builder.text("Here's some text")  # Add text content
builder.reasoning("Let me think about this...")  # Add reasoning
builder.data("weather", {"temperature": 20, "city": "Berlin"})  # Add structured data
builder.tool_call(  # Add tool usage
    "get_weather",
    input_data={"city": "Berlin"},
    output_data={"temperature": 20}
)
builder.finish()  # End the stream

# Build and return the stream
return builder.build()
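Under the hood, a builder like this can be thought of as accumulating an ordered list of events that is later flushed as a stream. The sketch below is not the library's actual implementation, just the general shape of the pattern:

```python
from dataclasses import dataclass, field

@dataclass
class SketchStreamBuilder:
    """Toy illustration of the builder pattern: each call appends an event."""
    events: list = field(default_factory=list)

    def start(self):
        self.events.append({"type": "start"})
        return self  # returning self enables method chaining

    def text(self, content: str):
        self.events.append({"type": "text-delta", "delta": content})
        return self

    def finish(self):
        self.events.append({"type": "finish"})
        return self

b = SketchStreamBuilder().start().text("Hello").finish()
# b.events now holds the three events in the order they were added
```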

Decorators

@ai_endpoint - Automatic AI SDK Response Handling

@app.post("/chat")
@ai_endpoint()
async def chat(message: str):
    builder = AIStreamBuilder()
    builder.text(f"Response: {message}")
    return builder

@streaming_endpoint - Simple Text Streaming

@app.get("/stream")
@streaming_endpoint(chunk_size=10, delay=0.1)
async def stream():
    return "This text will be streamed chunk by chunk"
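The chunking behaviour behind `chunk_size` is essentially fixed-width slicing of the text. A plain-Python sketch of that idea (the library's internals may differ):

```python
def chunk_text(text: str, chunk_size: int):
    """Yield successive fixed-size slices of text; the last chunk may be shorter."""
    for i in range(0, len(text), chunk_size):
        yield text[i:i + chunk_size]

list(chunk_text("This text will be streamed", 10))
# -> ['This text ', 'will be st', 'reamed']
```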

@tool_endpoint - Tool Call Handling

@app.post("/tools/weather")
@tool_endpoint("get_weather")
async def get_weather(city: str):
    # Your tool logic here
    return {"temperature": 20, "condition": "sunny"}
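Conceptually, a tool decorator like this wraps a plain function so that its input and output are reported as tool-call events around the call. A hypothetical sketch of that wrapping (the names and event shapes here are illustrative, not the library's API):

```python
import functools

def sketch_tool(tool_name: str):
    """Illustrative decorator: surface a function's kwargs and result as tool events."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(**kwargs):
            output = func(**kwargs)
            return [
                {"type": "tool-input-available", "toolName": tool_name, "input": kwargs},
                {"type": "tool-output-available", "toolName": tool_name, "output": output},
            ]
        return wrapper
    return decorator

@sketch_tool("get_weather")
def get_weather(city: str):
    return {"temperature": 20, "condition": "sunny"}

events = get_weather(city="Berlin")
# events pairs the captured input with the tool's output
```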

Advanced Examples

Streaming with Reasoning and Tools

@app.post("/api/advanced-chat")
@ai_endpoint()
async def advanced_chat(query: str):
    builder = AIStreamBuilder()

    # Start with reasoning
    builder.reasoning("Analyzing your query...")

    # Make a tool call
    weather_data = await get_weather_data("Berlin")
    builder.tool_call(
        "get_weather",
        input_data={"city": "Berlin"},
        output_data=weather_data
    )

    # Stream the response
    builder.text(f"Based on the weather data: {weather_data}")

    return builder

Custom Async Generators

import asyncio

from fastapi_ai_sdk import create_ai_stream_response

@app.get("/api/generate")
async def generate():
    async def event_generator():
        from fastapi_ai_sdk.models import StartEvent, TextDeltaEvent, FinishEvent

        yield StartEvent(message_id="gen_1")

        for word in ["Hello", " ", "from", " ", "FastAPI"]:
            yield TextDeltaEvent(id="txt_1", delta=word)
            await asyncio.sleep(0.1)

        yield FinishEvent()

    return create_ai_stream_response(event_generator())

Chunked Text Streaming

@app.post("/api/story")
@ai_endpoint()
async def generate_story(prompt: str):
    builder = AIStreamBuilder()

    story = await generate_long_story(prompt)  # Your story generation logic

    # Stream with custom chunk size
    builder.text(story, chunk_size=50)  # Streams in 50-character chunks

    return builder

Testing

Run the test suite:

# Install dev dependencies
pip install -e ".[dev]"

# Run tests with coverage
pytest --cov=fastapi_ai_sdk --cov-report=term-missing

# Run specific test file
pytest tests/test_models.py

# Run with verbose output
pytest -v

Development

Setup Development Environment

# Clone the repository
git clone https://github.com/doganarif/fastapi-ai-sdk.git
cd fastapi-ai-sdk

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install in development mode
pip install -e ".[dev]"

# Run linting
black fastapi_ai_sdk tests
isort fastapi_ai_sdk tests
flake8 fastapi_ai_sdk tests
mypy fastapi_ai_sdk

Code Style

This project uses:

  • Black for code formatting
  • isort for import sorting
  • flake8 for linting
  • mypy for type checking

License

MIT License - see LICENSE file for details.

Author

Arif Dogan

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request


Made with ❤️ for the FastAPI and AI community by Arif
