A Python package of easy-to-use tools for working with various language models and AI services. AIMU is designed primarily for running models locally, using Ollama or Hugging Face Transformers. However, it can also be used with cloud models (OpenAI, Anthropic, Google, etc.) through aisuite support (in development).
- Model Clients: Support for multiple AI model providers, including:
  - Ollama (local models)
  - Hugging Face Transformers (local models)
  - aisuite-supported models (cloud and local), including OpenAI (others coming)
- MCP Tools: Model Context Protocol (MCP) client for enhancing AI capabilities. Provides a simple(r) interface for FastMCP 2.0.
- Chat Conversation (Memory) Storage/Management: Chat conversation history management using TinyDB.
- Prompt Storage/Management: Prompt catalog for storing and versioning prompts using SQLAlchemy.
In addition to the AIMU package in the 'aimu' directory, the AIMU code repository includes:
- Jupyter notebooks demonstrating key AIMU features.
- An example chat client, built with Streamlit, using the AIMU model client, MCP tools support, and chat conversation management.
- A full suite of Pytest tests.
AIMU can be installed with Ollama support, Hugging Face Transformers support, and/or aisuite (cloud models) support. For all features, run:

```shell
pip install aimu[all]
```

Alternatively, for Ollama-only support:

```shell
pip install aimu[ollama]
```

For Hugging Face Transformers model support:

```shell
pip install aimu[hf]
```

For aisuite models (e.g. OpenAI):

```shell
pip install aimu[aisuite]
```

Once you've cloned the repository, run the following command to install all model dependencies:

```shell
pip install -e '.[all]'
```

Additionally, run the following command to install development (testing, linting) and notebook dependencies:

```shell
pip install -e '.[dev,notebooks]'
```

If you have uv installed, you can get all model and development dependencies with:

```shell
uv sync --all-extras
```

To generate a single response from a model:

```python
from aimu.models import OllamaClient as ModelClient  # or HuggingFaceClient, or AisuiteClient

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
response = model_client.generate("What is the capital of France?", {"temperature": 0.7})
```

For a multi-turn chat interaction:

```python
from aimu.models import OllamaClient as ModelClient

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
response = model_client.chat("What is the capital of France?")
```

To run the example Streamlit chat client:

```shell
cd streamlit
streamlit run chatbot_example.py
```

To call MCP tools directly:

```python
from aimu.tools import MCPClient

mcp_client = MCPClient({
    "mcpServers": {
        "mytools": {"command": "python", "args": ["tools.py"]},
    }
})

mcp_client.call_tool("mytool", {"input": "hello world!"})
```

To give a model access to MCP tools during a chat:

```python
from aimu.models import OllamaClient as ModelClient
from aimu.tools import MCPClient

mcp_client = MCPClient({
    "mcpServers": {
        "mytools": {"command": "python", "args": ["tools.py"]},
    }
})

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
model_client.mcp_client = mcp_client
model_client.chat("use my tool please")
```
To persist chat history across sessions:

```python
from aimu.models import OllamaClient as ModelClient
from aimu.memory import ConversationManager

chat_manager = ConversationManager("conversations.json", use_last_conversation=True)  # loads the last saved conversation

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
model_client.messages = chat_manager.messages
model_client.chat("What is the capital of France?")

chat_manager.update_conversation(model_client.messages)  # store the updated conversation
```
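ConversationManager persists the client's `messages` list to the TinyDB JSON file. The exact structure is defined by AIMU, but assuming the common role/content chat format (an assumption -- inspect `model_client.messages` for the structure AIMU actually uses), the stored history looks roughly like:

```python
# Sketch of a chat history in the common role/content format.
# This shape is an assumption; check model_client.messages for
# the structure AIMU actually stores.
messages = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "The capital of France is Paris."},
]

# The latest assistant reply is the last entry with role "assistant".
last_reply = next(m["content"] for m in reversed(messages) if m["role"] == "assistant")
```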
To store a versioned prompt:

```python
from aimu.prompts import PromptCatalog, Prompt

prompt_catalog = PromptCatalog("prompts.db")

prompt = Prompt("You are a helpful assistant", model_id="llama3.1:8b", version=1)
prompt_catalog.store_prompt(prompt)
```

This project is licensed under the Apache 2.0 license.