Conversation


@Green-Sky Green-Sky commented Jun 29, 2025

Very rough code, but it works.
Useful for avoiding the LoRA overhead during inference, or just for mixing your own model from LoRAs.

Patch based on (outdated) @stduhpf / bleedingedge: Green-Sky@69028c7

Keep in mind that the weights are quantized first and the LoRAs are applied afterwards, which is far from ideal.
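
To make that caveat concrete, here is a toy, self-contained sketch (not sd.cpp code). A uniform scalar quantizer stands in for q8_0, `delta` stands in for one element of the LoRA update scale * (B*A), and it assumes the merged result is re-quantized when written to the output file. Quantizing the base weight first and merging the delta afterwards rounds twice, and small deltas can be swallowed entirely by the quantization grid:

// Toy illustration of the ordering issue; not the actual sd.cpp code.
#include <cmath>
#include <cstdio>
#include <random>

// Round to a fixed step size (crude stand-in for q8_0 block quantization).
static float quantize(float x, float step = 1.0f / 16.0f) {
    return std::round(x / step) * step;
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> weight_dist(-1.0f, 1.0f);
    std::uniform_real_distribution<float> delta_dist(-0.05f, 0.05f);

    double err_quant_first = 0.0, err_merge_first = 0.0;
    const int n = 100000;
    for (int i = 0; i < n; ++i) {
        float w     = weight_dist(rng);
        float delta = delta_dist(rng);
        float exact = w + delta;  // ideal merged weight
        // Current order: base weight quantized first, LoRA delta applied on
        // top, result quantized again on write-out -> two rounding steps.
        float quant_first = quantize(quantize(w) + delta);
        // Preferable order: merge in full precision, quantize once at the end.
        float merge_first = quantize(w + delta);
        err_quant_first += (quant_first - exact) * (quant_first - exact);
        err_merge_first += (merge_first - exact) * (merge_first - exact);
    }
    std::printf("RMS error, quantize-then-merge: %.6f\n", std::sqrt(err_quant_first / n));
    std::printf("RMS error, merge-then-quantize: %.6f\n", std::sqrt(err_merge_first / n));
}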

Example

Merging the LCM LoRA into an SD 1.5 finetune:

$ result/bin/sd -M convert -m models/CyberRealistic_V9_FP16.safetensors --lora-model-dir models/loras/sd1/ -p "<lora:lcm-lora-sdv1-5:1>" -o models/CyberRealistic_V9-lcm-q8_0.gguf
[INFO ] model.cpp:910  - load models/CyberRealistic_V9_FP16.safetensors using safetensors format
[INFO ] model.cpp:2106 - lora lcm-lora-sdv1-5:1.00
[INFO ] model.cpp:2121 - found at 'models/loras/sd1/lcm-lora-sdv1-5.safetensors'
[INFO ] model.cpp:1987 - model tensors mem size: 2035.18MB
  |==================================================| 1131/1131 - 500.00it/s
[INFO ] model.cpp:2032 - load tensors done
[INFO ] model.cpp:910  - load models/loras/sd1/lcm-lora-sdv1-5.safetensors using safetensors format
[INFO ] lora.hpp:117  - loading LoRA from 'models/loras/sd1/lcm-lora-sdv1-5.safetensors'
  |==================================================| 834/834 - 0.00it/s
[INFO ] model.cpp:2047 - applied 'models/loras/sd1/lcm-lora-sdv1-5.safetensors':1.000000
[INFO ] model.cpp:2052 - trying to save tensors to models/CyberRealistic_V9-lcm-q8_0.gguf
convert 'models/CyberRealistic_V9_FP16.safetensors'/'' to 'models/CyberRealistic_V9-lcm-q8_0.gguf' success

@Green-Sky Green-Sky (Contributor, Author) commented

ggml switched to C++17; sd.cpp should too.

@Green-Sky Green-Sky changed the title from "merge in loras during model conversion" to "Apply LoRA during model conversion" on Jun 29, 2025