SwiftAI

A modern, type-safe Swift library for building AI-powered apps. SwiftAI provides a unified API that works seamlessly across different AI models - from Apple's on-device models to cloud-based services like OpenAI.

Swift 5.10+ · iOS 17.0+ · macOS 14.0+ · License: MIT

✨ Features

  • 🤖 Model Agnostic: Unified API across Apple's on-device models, OpenAI, Anthropic, and custom backends
  • 🎯 Structured Output: Strongly-typed structured outputs with compile-time validation
  • 🔧 Agent Tool Loop: First-class support for tool use
  • 💬 Conversations: Stateful chat sessions with automatic context management
  • 🏗️ Extensible: Plugin architecture for custom models and tools
  • ⚑ Swift-Native: Built with async/await and modern Swift concurrency

🚀 Quick Start

import SwiftAI

let llm = SystemLLM()
let response = try await llm.reply(to: "What is the capital of France?")
print(response.content) // "Paris"

📦 Installation

Swift Package Manager

Xcode:

  1. Go to File β†’ Add Package Dependencies
  2. Enter: https://github.com/mi12labs/SwiftAI
  3. Click Add Package

Package.swift:

dependencies: [
    .package(url: "https://github.com/mi12labs/SwiftAI", branch: "main")
]
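
If you declare the dependency in Package.swift, the target that imports SwiftAI also needs the library product. A minimal sketch, assuming the product and package are both named "SwiftAI" and your target is called MyApp (adjust the names to match your manifest):

.target(
    name: "MyApp",
    dependencies: [
        // Product/package names here are assumptions; align them with your manifest.
        .product(name: "SwiftAI", package: "SwiftAI")
    ]
)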

📚 Getting Started

🚀 Step 1: Your First AI Query

Start with the simplest possible example - just ask a question and get an answer:

import SwiftAI

// Initialize Apple's on-device language model.
let llm = SystemLLM()

// Ask a question and get a response.
let response = try await llm.reply(to: "What is the capital of France?")
print(response.content) // "Paris"

What just happened?

  • SystemLLM() creates Apple's on-device AI model
  • reply(to:) sends your question and returns a String by default
  • try await handles the asynchronous AI processing (see the sketch after this list for calling it from non-async code)
  • The response is wrapped in a response object - use .content to get the actual text
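
Because reply(to:) is asynchronous and can throw, calling it from ordinary synchronous code means hopping into a Task and handling errors. A minimal sketch (the concrete error types SwiftAI throws aren't listed in this guide, so the catch is generic; handleAskButtonTap is just a hypothetical call site):

import SwiftAI

func handleAskButtonTap() {
  Task {
    let llm = SystemLLM()
    do {
      let response = try await llm.reply(to: "What is the capital of France?")
      print(response.content) // The wrapper's .content holds the generated text.
    } catch {
      // Generation can fail (e.g. the model is unavailable); surface it however your app prefers.
      print("Reply failed: \(error)")
    }
  }
}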

📊 Step 2: Structured Responses

Instead of getting plain text, let's get structured data that your app can use directly:

// Define the structure you want back
@Generable
struct CityInfo {
  let name: String
  let country: String
  let population: Int
}

let response = try await llm.reply(
  to: "Tell me about Tokyo",
  returning: CityInfo.self // Tell the LLM what to output
)

let cityInfo = response.content
print(cityInfo.name)       // "Tokyo"
print(cityInfo.country)    // "Japan"
print(cityInfo.population) // 13960000

What's new here?

  • @Generable tells SwiftAI this struct can be generated by AI
  • returning: CityInfo.self specifies you want structured data, not a string
  • SwiftAI automatically converts the AI's response into your struct
  • No JSON parsing required!

💡 Key Concept: Type-Safe AI

SwiftAI ensures the AI returns data in exactly the format your code expects. If the AI can't generate valid data, you'll get an error instead of broken data.
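
In practice that means a failed structured generation surfaces as a thrown error rather than a half-filled CityInfo, so typed calls usually sit in a do/catch. A short sketch (the specific error type isn't documented here, so it catches generically):

do {
  let response = try await llm.reply(
    to: "Tell me about Tokyo",
    returning: CityInfo.self
  )
  print(response.content.name) // Always a complete, valid CityInfo.
} catch {
  // No partially parsed or malformed value ever reaches your code.
  print("Could not generate a CityInfo: \(error)")
}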

🛠️ Step 3: Tool Use

Let your AI call functions in your app to get real-time information:

// Create a tool the AI can use
struct WeatherTool: Tool {
  let description = "Get current weather for a city"

  @Generable
  struct Arguments {
    let city: String
  }

  func call(arguments: Arguments) async throws -> String {
    // Your weather API logic here
    return "It's 72°F and sunny in \(arguments.city)"
  }
}

// Use the tool with your AI
let weatherTool = WeatherTool()
let response = try await llm.reply(
    to: "What's the weather like in San Francisco?",
    tools: [weatherTool]
)
print(response.content) // "Based on current data, it's 72°F and sunny in San Francisco"

What's new here?

  • Tool protocol lets you create functions the AI can call
  • Arguments struct defines what parameters your tool needs (also @Generable)
  • The AI automatically decides when to call your tool
  • You get back a natural language response that incorporates the tool's data

💡 Key Concept: AI Function Calling

The AI reads your tool's description and automatically decides whether to call it. You don't manually trigger tools - the AI does it when needed.
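
Because that choice is driven by the prompt and each tool's description, you can offer several tools at once and let the model pick whichever it needs. A sketch with a second, purely hypothetical TimeTool alongside the WeatherTool above:

// Hypothetical second tool, shaped the same way as WeatherTool.
struct TimeTool: Tool {
  let description = "Get the current local time for a city"

  @Generable
  struct Arguments {
    let city: String
  }

  func call(arguments: Arguments) async throws -> String {
    // Your clock/time-zone lookup would go here.
    return "It's 9:41 AM in \(arguments.city)"
  }
}

// The model decides which tool(s) to call based on the question.
let response = try await llm.reply(
  to: "Is it warm enough for a morning run in San Francisco right now?",
  tools: [WeatherTool(), TimeTool()]
)
print(response.content)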

🔄 Step 4: Model Switching

Different AI models have different strengths. SwiftAI makes switching seamless:

// Choose your model based on availability
let llm: any LLM = {
  let systemLLM = SystemLLM()
  return systemLLM.isAvailable ? systemLLM : OpenaiLLM(apiKey: "your-api-key")
}()

// Same code works with any model
let response = try await llm.reply(to: "Write a haiku about Berlin.")
print(response.content)

What's new here?

  • SystemLLM runs on-device (private, fast, free)
  • OpenaiLLM uses the cloud (more capable, requires API key)
  • isAvailable checks if the on-device model is ready
  • Same reply() method works with any LLM

💡 Key Concept: Model Agnostic API

Your code doesn't change when you switch models. This lets you optimize for different scenarios (privacy, capabilities, cost) without rewriting your app.
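
One practical pattern is to write features against the LLM protocol and inject whichever model fits the situation. The summarize helper below is illustrative, not part of the library; it assumes the default response content is a String, as in Step 1:

// Feature code depends only on the LLM protocol, not on a concrete model.
func summarize(_ text: String, using llm: any LLM) async throws -> String {
  let response = try await llm.reply(to: "Summarize in one sentence: \(text)")
  return response.content
}

// The same function runs on-device or against the cloud.
let article = "SwiftAI is a type-safe Swift library for building AI-powered apps."
let local = try await summarize(article, using: SystemLLM())
let hosted = try await summarize(article, using: OpenaiLLM(apiKey: "your-api-key"))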

💬 Step 5: Conversations

For multi-turn conversations, use Chat to maintain context across messages:

// Create a chat with tools
let chat = try Chat(with: llm, tools: [weatherTool])

// Have a conversation
let greeting = try await chat.send("Hello! I'm planning a trip.")
let advice = try await chat.send("What should I pack for Seattle?")
// The AI remembers context from previous messages

What's new here?

  • Chat maintains conversation history automatically
  • send() is like reply() but remembers previous messages
  • Tools work in conversations too
  • The AI remembers context from earlier in the conversation

💡 Key Concept: Stateful vs Stateless

  • reply() is stateless - each call is independent
  • Chat is stateful - it builds on the previous conversation (see the sketch below)
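
To see the difference side by side, compare two independent reply() calls with the same exchange inside a Chat. A sketch, assuming send() returns the same response wrapper as reply() and that the tools list may be empty:

// Stateless: the second call knows nothing about the first.
_ = try await llm.reply(to: "My name is Maya.")
let forgotten = try await llm.reply(to: "What's my name?")
print(forgotten.content) // The model has no way to know.

// Stateful: Chat carries earlier turns into each new request.
let chat = try Chat(with: llm, tools: [])
_ = try await chat.send("My name is Maya.")
let remembered = try await chat.send("What's my name?")
print(remembered.content) // "Maya"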

🎯 Step 6: Advanced Constraints

Add validation rules and descriptions to guide AI generation:

@Generable
struct UserProfile {
  @Guide(description: "A valid username starting with a letter", .pattern("^[a-zA-Z][a-zA-Z0-9_]{2,}$"))
  let username: String

  @Guide(description: "User age in years", .minimum(13), .maximum(120))
  let age: Int

  @Guide(description: "One to three favorite colors", .minimumCount(1), .maximumCount(3))
  let favoriteColors: [String]
}

What's new here?

  • @Guide adds constraints and descriptions to fields, which help the LLM generate good content
  • .pattern() tells the LLM to follow a regex
  • .minimum() and .maximum() constrain numbers
  • .minimumCount() and .maximumCount() control array sizes

💡 Key Concept: Validated Generation

Constraints ensure the AI follows your business rules.
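
Requesting a constrained value looks exactly like any other structured request; the constraints travel with the type. A short sketch reusing UserProfile from above:

let response = try await llm.reply(
  to: "Create a sample profile for a teenage user who loves cool colors",
  returning: UserProfile.self
)

let profile = response.content
print(profile.username)       // Matches the ^[a-zA-Z][a-zA-Z0-9_]{2,}$ pattern
print(profile.age)            // Between 13 and 120
print(profile.favoriteColors) // One to three entries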

🎯 Quick Reference

| What You Want        | What To Use          | Example                                    |
|----------------------|----------------------|--------------------------------------------|
| Simple text response | reply(to:)           | reply(to: "Hello")                         |
| Structured data      | reply(to:returning:) | reply(to: "...", returning: MyStruct.self) |
| Function calling     | reply(to:tools:)     | reply(to: "...", tools: [myTool])          |
| Conversation         | Chat                 | chat.send("Hello")                         |
| Model switching      | any LLM              | SystemLLM() or OpenaiLLM()                 |

🔧 Supported Models

| Model     | Type        | Privacy     | Capabilities | Cost        |
|-----------|-------------|-------------|--------------|-------------|
| SystemLLM | On-device   | 🔒 Private  | Good         | 🆓 Free     |
| OpenaiLLM | Cloud API   | ⚠️ Shared   | Excellent    | 💰 Paid     |
| CustomLLM | Your choice | Your choice | Your choice  | Your choice |

📖 Examples

Check the Examples/ directory for sample apps that use SwiftAI.

⚑ Feature Parity Status vs FoundationModels SDK

| Feature                      | Status    |
|------------------------------|-----------|
| Streaming responses          | ❌ #issue |
| Structured outputs for enums | ❌ #issue |

🤝 Contributing

We welcome contributions! Please read our Contributing Guidelines.

Development Setup

git clone https://github.com/mi12labs/SwiftAI.git
cd SwiftAI
swift build
swift test

📄 License

SwiftAI is released under the MIT License. See LICENSE for details.

⚠️ Alpha ⚠️

SwiftAI is alpha 🚧 – rough edges and breaking changes are expected.


Built with ❤️ for the Swift community
