Do we package the CUDA toolkit with the engine?
If yes: will we then have to do the same for llamacpp, tensorrt-llm, and onnx?
If no: the toolkit will be downloaded separately.
What folder structure should we use (e.g. if a user has llamacpp and tensorrt installed at the same time)? See the illustrative layout below.
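One possible layout, purely illustrative and not a decision from this issue (the engine folder names and the shared `deps/` folder are assumptions): if the CUDA dependency is downloaded separately, it could live alongside the engines that need it so both can resolve it.

```
engines/
├── llamacpp/          # llama.cpp engine binaries
├── tensorrt-llm/      # TensorRT-LLM engine binaries
└── deps/
    └── cuda/          # shared CUDA toolkit dependency (extracted cuda.tar.gz)
```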
Resources: llama.cpp releases
Currently we download the CUDA toolkit dependency via https://catalog.jan.ai/dist/cuda-dependencies/<version>/<platform>/cuda.tar.gz
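For reference, a minimal sketch of what the separate-download path could look like, based only on the URL pattern above; the function name, destination folder, and the example version/platform values are assumptions for illustration, not the actual engine code.

```python
# Sketch: fetch and unpack the CUDA dependency archive from the catalog URL pattern.
import tarfile
import urllib.request
from pathlib import Path

BASE_URL = "https://catalog.jan.ai/dist/cuda-dependencies"

def download_cuda_dependency(version: str, platform: str, dest: Path) -> Path:
    """Download cuda.tar.gz for the given version/platform and extract it into dest."""
    url = f"{BASE_URL}/{version}/{platform}/cuda.tar.gz"
    dest.mkdir(parents=True, exist_ok=True)
    archive = dest / "cuda.tar.gz"
    urllib.request.urlretrieve(url, archive)   # fetch the tarball
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest)                   # unpack next to the engine
    return dest

# Example usage (version/platform values are hypothetical):
# download_cuda_dependency("12.4", "linux-amd64", Path("engines/deps/cuda"))
```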