A modern, type-safe Swift library for building AI-powered apps. SwiftAI provides a unified API that works seamlessly across different AI models - from Apple's on-device models to cloud-based services like OpenAI.
- 🤖 Model Agnostic: Unified API across Apple's on-device models, OpenAI, Anthropic, and custom backends
- 🎯 Structured Output: Strongly-typed structured outputs with compile-time validation
- 🔧 Agent Tool Loop: First-class support for tool use
- 💬 Conversations: Stateful chat sessions with automatic context management
- 🏗️ Extensible: Plugin architecture for custom models and tools
- ⚡ Swift-Native: Built with async/await and modern Swift concurrency
```swift
import SwiftAI

let llm = SystemLLM()
let response = try await llm.reply(to: "What is the capital of France?")
print(response.content) // "Paris"
```

Xcode:
- Go to File → Add Package Dependencies
- Enter `https://github.com/mi12labs/SwiftAI`
- Click Add Package
Package.swift:

```swift
dependencies: [
  .package(url: "https://github.com/mi12labs/SwiftAI", branch: "main")
]
```

Start with the simplest possible example: ask a question and get an answer.
```swift
import SwiftAI

// Initialize Apple's on-device language model.
let llm = SystemLLM()

// Ask a question and get a response.
let response = try await llm.reply(to: "What is the capital of France?")
print(response.content) // "Paris"
```

What just happened?
- `SystemLLM()` creates Apple's on-device AI model
- `reply(to:)` sends your question and returns a `String` by default
- `try await` handles the asynchronous AI processing
- The result is wrapped in a `response` object; use `.content` to get the actual text
Instead of getting plain text, let's get structured data that your app can use directly:
```swift
// Define the structure you want back.
@Generable
struct CityInfo {
  let name: String
  let country: String
  let population: Int
}

let response = try await llm.reply(
  to: "Tell me about Tokyo",
  returning: CityInfo.self // Tell the LLM what to output
)

let cityInfo = response.content
print(cityInfo.name)       // "Tokyo"
print(cityInfo.country)    // "Japan"
print(cityInfo.population) // 13960000
```

What's new here?
- `@Generable` tells SwiftAI this struct can be generated by AI
- `returning: CityInfo.self` specifies you want structured data, not a string
- SwiftAI automatically converts the AI's response into your struct
- No JSON parsing required!
SwiftAI ensures the AI returns data in exactly the format your code expects. If the AI can't generate valid data, you'll get an error instead of broken data.
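In practice that means a failed generation surfaces as a thrown error you can handle with ordinary Swift error handling. A minimal sketch, reusing the `CityInfo` and `llm` definitions above (the concrete error cases depend on the model backend):

```swift
// Sketch: falling back gracefully when structured generation fails.
func fetchCityInfo(using llm: any LLM) async -> CityInfo? {
  do {
    let response = try await llm.reply(
      to: "Tell me about Tokyo",
      returning: CityInfo.self
    )
    return response.content // guaranteed to be a well-formed CityInfo
  } catch {
    // The model couldn't produce valid data; log and fall back.
    print("Could not generate a valid CityInfo: \(error)")
    return nil
  }
}
```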
Let your AI call functions in your app to get real-time information:
```swift
// Create a tool the AI can use.
struct WeatherTool: Tool {
  let description = "Get current weather for a city"

  @Generable
  struct Arguments {
    let city: String
  }

  func call(arguments: Arguments) async throws -> String {
    // Your weather API logic here.
    return "It's 72°F and sunny in \(arguments.city)"
  }
}

// Use the tool with your AI.
let weatherTool = WeatherTool()
let response = try await llm.reply(
  to: "What's the weather like in San Francisco?",
  tools: [weatherTool]
)

print(response.content) // "Based on current data, it's 72°F and sunny in San Francisco"
```

What's new here?
- `Tool` protocol lets you create functions the AI can call
- `Arguments` struct defines what parameters your tool needs (also `@Generable`)
- The AI automatically decides when to call your tool
- You get back a natural language response that incorporates the tool's data
The AI reads your tool's description and automatically decides whether to call it. You don't manually trigger tools - the AI does it when needed.
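With several tools registered, the model picks among them by their descriptions. `TimeTool` below is a hypothetical second tool added purely for illustration:

```swift
// A hypothetical second tool; the model chooses between it and
// WeatherTool based on each tool's description.
struct TimeTool: Tool {
  let description = "Get the current local time for a city"

  @Generable
  struct Arguments {
    let city: String
  }

  func call(arguments: Arguments) async throws -> String {
    // Stubbed; real logic would look up the city's timezone.
    return "It's currently mid-afternoon in \(arguments.city)"
  }
}

// A question that touches both tools may trigger both calls.
let response = try await llm.reply(
  to: "What's the weather and local time in Tokyo?",
  tools: [WeatherTool(), TimeTool()]
)
```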
Different AI models have different strengths. SwiftAI makes switching seamless:
```swift
// Choose your model based on availability.
let llm: any LLM = {
  let systemLLM = SystemLLM()
  return systemLLM.isAvailable ? systemLLM : OpenaiLLM(apiKey: "your-api-key")
}()

// Same code works with any model.
let response = try await llm.reply(to: "Write a haiku about Berlin.")
print(response.content)
```

What's new here?
- `SystemLLM` runs on-device (private, fast, free)
- `OpenaiLLM` uses the cloud (more capable, requires API key)
- `isAvailable` checks if the on-device model is ready
- The same `reply()` method works with any LLM
Your code doesn't change when you switch models. This lets you optimize for different scenarios (privacy, capabilities, cost) without rewriting your app.
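One way to exploit this is to write app code once against the `LLM` protocol. `summarize` below is a hypothetical helper, not part of SwiftAI:

```swift
// A hypothetical helper written once against the LLM protocol.
func summarize(_ text: String, using llm: any LLM) async throws -> String {
  let response = try await llm.reply(to: "Summarize in one sentence: \(text)")
  return response.content
}

// The same helper works with either backend:
// try await summarize(article, using: SystemLLM())
// try await summarize(article, using: OpenaiLLM(apiKey: apiKey))
```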
For multi-turn conversations, use Chat to maintain context across messages:
```swift
// Create a chat with tools.
let chat = try Chat(with: llm, tools: [weatherTool])

// Have a conversation.
let greeting = try await chat.send("Hello! I'm planning a trip.")
let advice = try await chat.send("What should I pack for Seattle?")
// The AI remembers context from previous messages.
```

What's new here?
- `Chat` maintains conversation history automatically
- `send()` is like `reply()` but remembers previous messages
- Tools work in conversations too
- The AI remembers context from earlier in the conversation

`reply()` is stateless: each call is independent. `Chat` is stateful: it builds on the previous conversation.
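The difference is easiest to see side by side. A sketch reusing the `llm` from above (the prompts are illustrative):

```swift
// Stateless: each reply() call is independent, so the second call
// has no memory of the first.
_ = try await llm.reply(to: "My name is Maya.")
let forgotten = try await llm.reply(to: "What's my name?") // the model has no way to know

// Stateful: Chat carries the history forward, so the second message
// is answered with the first one in context.
let chat = try Chat(with: llm)
_ = try await chat.send("My name is Maya.")
let remembered = try await chat.send("What's my name?") // history includes "Maya"
```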
Add validation rules and descriptions to guide AI generation:
```swift
@Generable
struct UserProfile {
  @Guide(description: "A valid username starting with a letter", .pattern("^[a-zA-Z][a-zA-Z0-9_]{2,}$"))
  let username: String

  @Guide(description: "User age in years", .minimum(13), .maximum(120))
  let age: Int

  @Guide(description: "One to three favorite colors", .minimumCount(1), .maximumCount(3))
  let favoriteColors: [String]
}
```

What's new here?
- `@Guide` adds constraints and descriptions to fields, which help the LLM generate good content
- `.pattern()` tells the LLM to follow a regex
- `.minimum()` and `.maximum()` constrain numbers
- `.minimumCount()` and `.maximumCount()` control array sizes
Constraints ensure the AI follows your business rules.
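Putting the guided type to work: a minimal sketch that asks the model for a `UserProfile`, reusing the `llm` and `UserProfile` definitions above:

```swift
// The guides travel with the schema, so the generated value
// respects the regex, numeric range, and array-count rules.
let response = try await llm.reply(
  to: "Create a sample user profile for a hiking app",
  returning: UserProfile.self
)
let profile = response.content
// profile.username starts with a letter; profile.age is between 13 and 120;
// profile.favoriteColors has one to three entries.
```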
| What You Want | What To Use | Example |
|---|---|---|
| Simple text response | `reply(to:)` | `reply(to: "Hello")` |
| Structured data | `reply(to:returning:)` | `reply(to: "...", returning: MyStruct.self)` |
| Function calling | `reply(to:tools:)` | `reply(to: "...", tools: [myTool])` |
| Conversation | `Chat` | `chat.send("Hello")` |
| Model switching | `any LLM` | `SystemLLM()` or `OpenaiLLM()` |
| Model | Type | Privacy | Capabilities | Cost |
|---|---|---|---|---|
| SystemLLM | On-device | 🔒 Private | Good | 🆓 Free |
| OpenaiLLM | Cloud API | | Excellent | 💰 Paid |
| CustomLLM | Your choice | Your choice | Your choice | Your choice |
Check the Examples/ directory for sample apps that use SwiftAI.
| Feature | Status |
|---|---|
| Streaming responses | ❌ #issue |
| Structured outputs for enums | ❌ #issue |
We welcome contributions! Please read our Contributing Guidelines.
```bash
git clone https://github.com/mi12labs/SwiftAI.git
cd SwiftAI
swift build
swift test
```

SwiftAI is released under the MIT License. See LICENSE for details.
SwiftAI is alpha 🚧: rough edges and breaking changes are expected.
Built with ❤️ for the Swift community