
macai


macai (macOS AI) is a simple yet powerful native macOS AI chat client that supports most AI providers: ChatGPT, Claude, xAI (Grok), Google Gemini, Perplexity, Ollama, OpenRouter, and almost any OpenAI-compatible APIs.

Downloads

Manual

Download the latest universal binary, notarized by Apple.

Homebrew

Install the macai cask with Homebrew: brew install --cask macai

Build from source

Check out the main branch and open the project in Xcode 14.3 or later.
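
For example, a minimal sketch of building from source; the repository URL follows from the project path, and the exact Xcode project file name is assumed here:

    # Clone the repository and open it in Xcode 14.3 or later
    git clone https://github.com/Renset/macai.git
    cd macai
    open macai.xcodeproj  # project file name assumed; open whichever .xcodeproj/.xcworkspace the repo contains

Then build and run the app target from Xcode as usual.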

Contributions

Contributions are welcome. Check the Issues page for existing feature requests and bug reports before creating a new one. You can also support the project with funding; this support is very important to me and lets me focus more on macai development.

Buy Me A Coffee

Why macai

  • macOS-native and lightweight
  • User-friendly: simple setup, minimalist light/dark UI
  • Feature-rich: vision, image generation, search, reasoning, import/export and more
  • Private and secure: no telemetry or usage tracking

Run with ChatGPT, Claude, xAI or Google Gemini

To run macai with ChatGPT, Claude, or another commercial provider, you need an API token. An API token works like a password: you must obtain one before you can use any commercial LLM API. Most services offer free credits when you register a new account, so you can try most of them for free. See each provider's documentation for how to obtain its API token.
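
For example, a quick way to confirm an OpenAI token works before adding it to macai (a sketch that assumes the key is exported as the OPENAI_API_KEY environment variable; other providers expose similar endpoints, so check their documentation):

    # List the models available to your account; a JSON response means the token is valid
    curl https://api.openai.com/v1/models \
      -H "Authorization: Bearer $OPENAI_API_KEY"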

If you are new to LLMs and don't want to pay for tokens, take a look at Ollama. It supports dozens of open-source LLM models that can run locally on Apple M1/M2/M3/M4 Macs.

Run with Ollama

Ollama is an open-source backend for running various LLM models. Running macai with Ollama is easy:

  1. Install Ollama from the official website

  2. Follow the installation guides

  3. After installation, choose a model (llama3.1 or llama3.2 are recommended) and pull it with the terminal command ollama pull <model> (see the snippet after this list)

  4. In macai settings, open the API Service tab, add a new API service (Expert mode), and select the Ollama type

  5. Select the model and the default AI Assistant, then save

  6. Test and enjoy!
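
As referenced in step 3, here is a minimal sketch of pulling a model and checking that Ollama is reachable before configuring macai. It assumes Ollama's default local endpoint, http://localhost:11434; adjust if you changed the configuration:

    # Pull a recommended model
    ollama pull llama3.1

    # Verify the server is running and the model is listed
    curl http://localhost:11434/api/tags

If the curl call returns a JSON list that includes the pulled model, macai can connect to the same endpoint.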

System requirements

macOS 14.0 or later (both Intel and Apple Silicon are supported)

Project status

The project is in active development.

License

Apache-2.0
