Run `nix run github:jim3692/koboldcpp-flake`.

This downloads `Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf` via `huggingface-cli` (skipping the download if the file already exists in `~/.cache/huggingface`), then starts KoboldCpp with the Vulkan backend.
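The startup behavior can be sketched as a dry-run shell script. This is an assumed reconstruction, not the actual flake code: the Hugging Face repo name is not stated in this README (shown as a `<repo>` placeholder), and `--usevulkan` is KoboldCpp's flag for selecting the Vulkan backend.

```shell
#!/bin/sh
# Dry-run sketch of the flake's startup logic (assumed; echoes commands
# instead of executing them).
MODEL_FILE="Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf"
CACHE_DIR="${HF_HOME:-$HOME/.cache/huggingface}"

# Download only when the model is not already cached.
if ! find "$CACHE_DIR" -name "$MODEL_FILE" 2>/dev/null | grep -q .; then
  # <repo> is a placeholder: the source does not name the Hugging Face repo.
  echo "huggingface-cli download <repo> $MODEL_FILE"
fi

# Start KoboldCpp with the Vulkan backend.
echo "koboldcpp --usevulkan --model $CACHE_DIR/$MODEL_FILE"
```

Running the real command requires a Vulkan-capable GPU driver; on first run, expect a multi-gigabyte model download into the Hugging Face cache.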