Hypster is a lightweight configuration framework for managing and optimizing AI & ML workflows
⚠️ Hypster is in active development and not yet battle-tested in production. If you’re gaining value and want to promote it to production, please reach out!
- 🐍 Pythonic API: Intuitive & minimal syntax that feels natural to Python developers
- 🪆 Hierarchical, Conditional Configurations: Support for nested and swappable configurations
- 📐 Type Safety: Built-in type hints and validation
- 🧪 Hyperparameter Optimization Built-In: Native, first-class optuna support
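To make the "hierarchical, conditional" bullet concrete: because configuration functions are plain Python (define-by-run style), ordinary control flow decides which parameters exist for a given selection. The `HP` class below is a toy stand-in written purely for this illustration — it is not Hypster's actual implementation:

```python
class HP:
    """Toy stand-in for Hypster's HP object (illustration only)."""

    def __init__(self, overrides):
        self.overrides = overrides

    def select(self, options, name):
        # Use the override if given, otherwise the first option.
        value = self.overrides.get(name, options[0])
        assert value in options, f"{value!r} is not a valid option for {name!r}"
        return value

    def float(self, default, name, min=None, max=None):
        return self.overrides.get(name, default)


def config(hp: HP):
    # Plain Python branching makes the configuration conditional:
    # the "model" options only exist for the branch that is taken.
    provider = hp.select(["openai", "anthropic"], name="provider")
    if provider == "openai":
        model = hp.select(["gpt-4o-mini", "gpt-4o"], name="model")
    else:
        model = hp.select(["claude-3-5-haiku"], name="model")
    temperature = hp.float(0.7, name="temperature", min=0.0, max=1.0)
    return {"provider": provider, "model": model, "temperature": temperature}


result = config(HP({"provider": "openai", "temperature": 0.2}))
```

Swapping `"provider"` to `"anthropic"` changes which model choices are even reachable — the same mechanism Hypster uses for nested and swappable configurations.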
You can install Hypster using uv:

```bash
uv add hypster

# optional HPO backend
uv add 'hypster[optuna]'
```

Or using pip:

```bash
pip install hypster
```

Define a configuration function and instantiate it with overrides:
```python
from hypster import HP, instantiate
from llm import LLM


def llm_config(hp: HP):
    model_name = hp.select(["gpt-4o-mini", "gpt-4o"], name="model_name")
    temperature = hp.float(0.7, name="temperature", min=0.0, max=1.0)
    max_tokens = hp.int(256, name="max_tokens", min=1, max=4096)
    llm = LLM(model_name=model_name, temperature=temperature, max_tokens=max_tokens)
    return llm


llm = instantiate(llm_config, values={"model_name": "gpt-4o-mini", "temperature": 0.3})
llm.invoke("How's your day going?")
```

Run hyperparameter optimization with the built-in Optuna integration:

```python
import optuna
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

from hypster import instantiate
from hypster.hpo.types import HpoInt, HpoFloat, HpoCategorical
from hypster.hpo.optuna import suggest_values


def objective(trial: optuna.Trial) -> float:
    # model_cfg is a Hypster configuration function defined elsewhere
    values = suggest_values(trial, config=model_cfg)
    model = instantiate(model_cfg, values=values)
    X, y = make_classification(
        n_samples=400, n_features=20, n_informative=10, random_state=42
    )
    return cross_val_score(model, X, y, cv=3, n_jobs=-1).mean()


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
```

Hypster draws inspiration from Meta's Hydra and the hydra-zen framework. The API design is influenced by Optuna's "define-by-run" API.
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.