1. Ensure you have a `docker/config.env` with the necessary config keys (any `${VARIABLE}` in the `docker/docker-compose.yml` file)
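
   The exact keys depend on which `${VARIABLE}` references appear in `docker/docker-compose.yml`; the names and values below are only illustrative placeholders, not the project's actual keys:

   ```bash
   # docker/config.env -- placeholder keys; replace with the ${VARIABLE} names
   # actually referenced in docker/docker-compose.yml.
   POSTGRES_USER=postgres
   POSTGRES_PASSWORD=changeme
   MINIO_ROOT_USER=minioadmin
   MINIO_ROOT_PASSWORD=minioadmin
   ```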
2. `tasks/docker-start.sh`
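
   The start script itself is not reproduced here; assuming it simply wraps `docker-compose` with the env file, an equivalent invocation would look roughly like:

   ```bash
   # Hypothetical equivalent of tasks/docker-start.sh: start the stack in the
   # background, substituting ${VARIABLE} values from docker/config.env.
   docker-compose --env-file docker/config.env -f docker/docker-compose.yml up -d
   ```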
3. Configure connection credentials
   - PostgreSQL credentials
     - Ensure the new PostgreSQL instance has a user for the `optuna` db and a different one for the `mlflow` db
     - Create a new database called `optuna` and another called `mlflow` (if they don't already exist)
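
     One way to create these users and databases, assuming the Postgres service in `docker/docker-compose.yml` is called `postgres` and is reachable as its default superuser (user names and passwords below are placeholders):

     ```bash
     # Hypothetical example: run psql inside the compose Postgres service and
     # create separate users/databases for Optuna and MLflow.
     docker-compose -f docker/docker-compose.yml exec -T postgres psql -U postgres <<'SQL'
     CREATE USER optuna_user WITH PASSWORD 'optuna_pass';
     CREATE USER mlflow_user WITH PASSWORD 'mlflow_pass';
     CREATE DATABASE optuna OWNER optuna_user;
     CREATE DATABASE mlflow OWNER mlflow_user;
     SQL
     ```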
   - MinIO credentials
     - Create a new access key, either using the `mc` CLI client or the MinIO console (usually found at `localhost:9001`)
     - Create a new bucket called `mlflow`
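
     If you prefer the CLI to the console, a minimal `mc` sketch (the alias name, endpoint, and root credentials are assumptions; match them to your MinIO service):

     ```bash
     # Hypothetical example: point mc at the local MinIO API endpoint, add a new
     # access key (a MinIO user), and create the mlflow bucket.
     mc alias set local http://localhost:9000 minioadmin minioadmin
     mc admin user add local mlflow_access_key mlflow_secret_key
     mc mb local/mlflow
     # Depending on your MinIO version, the new key may also need a policy
     # (e.g. readwrite) attached before it can write to the bucket.
     ```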
4. Update `docker/config.env` with the newly created credentials
5. `docker-compose down` and repeat step 2
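
   A minimal restart sequence, assuming the compose file lives at `docker/docker-compose.yml`:

   ```bash
   # Tear the stack down, then bring it back up so the services pick up the
   # updated credentials from docker/config.env.
   docker-compose --env-file docker/config.env -f docker/docker-compose.yml down
   tasks/docker-start.sh
   ```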
6. Update connection variables in `create_study.py`
7. `python -m pip install -r requirements.txt`
8. `python run_trials.py`
9. Check the run at the MLflow tracking URL
About

Local deployment of MLflow using Docker and MinIO (local S3) as the storage backend.