macai (macOS AI) is a simple yet powerful native macOS AI chat client that supports most AI providers: ChatGPT, Claude, xAI (Grok), Google Gemini, Perplexity, Ollama, OpenRouter, and almost any OpenAI-compatible API.
Download the latest universal binary, notarized by Apple.
Install the macai cask with Homebrew:
brew install --cask macai
Check out the main branch and open the project in Xcode 14.3 or later.
Contributions are welcome. Take a look at the Issues page to check for already reported features/bugs before creating a new one. You can also support the project by funding. This support is very important to me and allows me to focus more on macai development.
- macOS-native and lightweight
- User-friendly: simple setup, minimalist light/dark UI
- Feature-rich: vision, image generation, search, reasoning, import/export and more
- Private and secure: no telemetry or usage tracking
To run macai with ChatGPT or Claude, you need an API token. An API token is like a password: you must obtain one before you can use any commercial LLM API. Most API services offer free credits when you register a new account, so you can try most of them for free. Here is how to get an API token for each supported service:
- OpenAI: https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key
- Claude: https://docs.anthropic.com/en/api/getting-started
- Google Gemini: https://ai.google.dev/gemini-api/docs/api-key
- xAI Grok: https://docs.x.ai/docs#models
- OpenRouter: https://openrouter.ai/docs/api-reference/authentication#using-an-api-key
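Once you have a key, it is worth sanity-checking it before pasting it into macai. A minimal sketch using OpenAI as an example, assuming the key is kept in an `OPENAI_API_KEY` environment variable (the placeholder key is hypothetical, and the live request is commented out so the snippet also works offline):

```shell
# Keep the token in an environment variable rather than typing it inline.
# Replace the placeholder with your real key:
export OPENAI_API_KEY="sk-your-key-here"

# Quick local check that the variable is set and non-empty:
echo "Key length: ${#OPENAI_API_KEY}"

# Live check against OpenAI's models endpoint (uncomment; requires network):
# curl -s https://api.openai.com/v1/models \
#   -H "Authorization: Bearer $OPENAI_API_KEY" | head -n 5
```

If the uncommented curl call returns a JSON list of models rather than an authentication error, the key is valid and ready to use in macai's API Service settings.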
If you are new to LLMs and don't want to pay for tokens, take a look at Ollama. It supports dozens of open-source LLM models that can run locally on Apple M1/M2/M3/M4 Macs.
Run with Ollama
Ollama is an open-source back-end for running various LLM models locally. Running macai with Ollama is easy:

1. Install Ollama from the official website
2. Follow the installation guides
3. After installation, select a model (llama3.1 or llama3.2 are recommended) and pull it using this command in the terminal: `ollama pull <model>`
4. In macai settings, open the API Service tab, add a new API service (Expert mode) and select the type "Ollama"
5. Select the model and the default AI Assistant, and save
6. Test and enjoy!
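The steps above can be verified from the terminal before opening macai. A small sketch, assuming Ollama's default local port 11434 (the API call is commented out in case `ollama serve` isn't running yet):

```shell
# Check the Ollama binary is on PATH, and list pulled models if it is:
if command -v ollama >/dev/null 2>&1; then
  ollama list          # shows pulled models, e.g. llama3.2
else
  echo "ollama not found in PATH"
fi

# With the server running, macai talks to Ollama's OpenAI-compatible endpoint.
# You can exercise it directly (uncomment; requires a pulled model):
# curl -s http://localhost:11434/v1/chat/completions \
#   -H "Content-Type: application/json" \
#   -d '{"model": "llama3.2", "messages": [{"role": "user", "content": "Hi"}]}'
```

If `ollama list` shows the model you pulled and the curl call returns a chat completion, macai's Ollama API service should work with the same model name.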
macOS 14.0 and later (both Intel and Apple silicon are supported)
The project is under active development.