Schema to describe content of the BDF Knowledgebase (BDFKB), including tools and related entities as well as the relationships between them.
https://ARPA-H-BDF.github.io/bdfkb-schema
- src/ - source files (edit these)
- project/ - auto-generated by LinkML tools; typically these are committed to source control, but they are ignored here until there is a specific use for them
- docs/ - auto-generated Markdown using GenDoc and files in `project/docs`
- wip/ - work in progress, including sub-schemas
bdfkb-schema provides a CI stage that you can integrate into your existing CI process, ensuring your tool is always up to date with the latest schema requirements.
To consume it, add the following job to your GitHub Actions workflow:
```yaml
validate-tool:
  uses: ARPA-H-BDF/bdfkb-schema/.github/workflows/ci-validate.yaml@main
  with:
    python-version: '<YOUR PYTHON VERSION HERE (as a string)>'
```

| Property | Description | Required? | Default |
|---|---|---|---|
| python-version | Minimum Python version supported by your application | True | '3.9' |
| validation-branch | Branch of bdfkb-data that you wish to compare against | False | 'main' |
| schema-url | URL of the bdfkb-schema file you wish to compare against | False | 'https://raw.githubusercontent.com/ARPA-H-BDF/bdfkb-schema/refs/heads/main/src/bdfkb_schema/schema/bdfkb_schema.yaml' |
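As a sketch, a complete caller workflow that sets all three inputs might look like the following (the workflow name, trigger, and Python version here are illustrative assumptions, not requirements):

```yaml
# Hypothetical workflow in your tool's repository, e.g. .github/workflows/validate.yaml
name: Validate against bdfkb-schema
on: [push, pull_request]

jobs:
  validate-tool:
    uses: ARPA-H-BDF/bdfkb-schema/.github/workflows/ci-validate.yaml@main
    with:
      python-version: '3.11'      # minimum Python version your tool supports (as a string)
      validation-branch: 'main'   # branch of bdfkb-data to compare against
      schema-url: 'https://raw.githubusercontent.com/ARPA-H-BDF/bdfkb-schema/refs/heads/main/src/bdfkb_schema/schema/bdfkb_schema.yaml'
```

Only `python-version` is required; the other two inputs fall back to the defaults in the table above.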
Note
Currently, only GitHub Actions is supported. If you would like to use this in GitLab or another repository platform, feel free to create an MR or open an issue!
Details
Use the `make` command to generate project artefacts:
- `make all`: make everything
- `make deploy`: deploys site
poetry setup (recommended)
Create virtual env and install dependencies:
poetry install
uv setup
- Install all dependencies and use the required Python version:
uv run main.py
- Install the LinkML tools (if not already installed):
uv tool install linkml
conda setup
- Create conda venv:
conda create -n "venv" python=3.9
- Activate venv:
conda activate venv
- Install dependencies:
pip install .
ER Diagram Generation
Create ER Diagram with Mermaid:
gen-erdiagram ./src/bdfkb_schema/schema/sample_import_schema/custom-llm-tool.yaml > mermaid.md
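For illustration, the generated Mermaid output has roughly the following shape; the entity and attribute names below are hypothetical placeholders, not taken from the actual schema:

```mermaid
erDiagram
    Tool {
        string id
        string name
    }
    Entity {
        string id
    }
    Tool ||--o{ Entity : "related_to"
```

Rendering `mermaid.md` in a Mermaid-aware viewer (e.g. the GitHub web UI) displays the diagram.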
The LinkML cookiecutter uses make, so this is the easiest way to get started. Here are some examples:
- Test all things:
make test
- Test the schema:
make test-schema
- Generate Pydantic classes:
make gen-pydantic
- And more!
Under the hood, the Makefile is essentially running Python and the LinkML generator tools, so you can run the same commands directly if you choose:
- Run tests:
poetry run python -m unittest discover
- Generate a Mermaid diagram of the schema:
gen-erdiagram -f mermaid src/bdfkb_schema/schema/bdfkb_schema.yaml > mermaid.md
- And more!
uv is a modern Python tool that aims to replace pip, pip-tools, pipx, poetry, pyenv, twine, virtualenv, and more. Learn more at https://github.com/astral-sh/uv
Some of the team uses uv, so you might find uv bits scattered around this project. However, since the LinkML cookiecutter project still uses poetry, this project has not yet fully embraced uv. Install uv, then run uv tool install linkml and explore at your own risk.
In order to test a schema change on the existing dataset, one recommended approach is:
- Push your branch of changes from `bdfkb-schema` to GitHub
- Go online and copy the raw file URL for `bdfkb-schema.yaml` on your feature branch
- Move to the `bdfkb-data` repository. In `main.py`, update the `SCHEMA_PATH` URL to the URL you just copied.
- Run the `main.py` file using whichever runtime you prefer (uv, poetry, conda)
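The `SCHEMA_PATH` update amounts to a one-line change in `bdfkb-data`'s `main.py`. A minimal sketch, assuming `SCHEMA_PATH` is a plain module-level string (the branch name `my-feature-branch` below is a hypothetical example, not a real branch):

```python
# Hypothetical excerpt of bdfkb-data's main.py: point SCHEMA_PATH at the raw
# schema YAML on your feature branch instead of main. The branch name here
# is a placeholder; substitute the URL you copied from GitHub.
SCHEMA_PATH = (
    "https://raw.githubusercontent.com/ARPA-H-BDF/bdfkb-schema/"
    "refs/heads/my-feature-branch/src/bdfkb_schema/schema/bdfkb_schema.yaml"
)
```

Once the dataset validates cleanly against your branch's schema, revert `SCHEMA_PATH` to the `main` URL before merging.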
This project was made with linkml-project-cookiecutter; however, much of what the cookiecutter provides is not needed here and has been removed. We opted to [mostly] stick with the cookiecutter template so it would be familiar to anyone who already knows the LinkML project (and template).