I haven't had time to review all of the new Flux code, but I solved this issue by adding `ggml_build_forward_expand(gf, zero_index);` to the end of `build_lora_graph`. My reasoning was that all the other places that use `set_backend_tensor_data` seem to resolve correctly because of that call, but for whatever reason the LoRA loading path did not. I don't know whether this is the intended solution, but it did get LoRAs loading again for me.
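For reference, the workaround described above amounts to a one-line addition at the end of `build_lora_graph`. This is a sketch, not a verified patch: the names `gf` and `zero_index` are taken from the comment, and the exact surrounding context may differ between versions of the code.

```cpp
// Sketch of the workaround, at the end of build_lora_graph:
// expanding the graph with zero_index makes that tensor part of the
// computed graph, so the data set via set_backend_tensor_data is
// actually picked up when the graph runs.
ggml_build_forward_expand(gf, zero_index);
```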
When I try to apply a LoRA to a quantized model, I get the following error:
This worked before. The same issue was encountered here: #291 (comment)
Commit 29ec316 appears to have broken something.
If it helps, here are the LoRAs and models I tried this with:
res-adapter
LoRA