Conversation
Codecov Report

Attention: Patch coverage is

Additional details and impacted files

```
@@            Coverage Diff             @@
##           master    #5726      +/-   ##
===========================================
- Coverage   71.58%   39.89%   -31.69%
===========================================
  Files         760      760
  Lines       69841    69845       +4
===========================================
- Hits        49996    27865    -22131
- Misses      19845    41980    +22135
```

Flags with carried forward coverage won't be shown.
@simpleoier Hi, could you please take a look and check whether your concern about the default behavior is addressed by this PR?
Sorry, I may not fully understand the purpose of this PR. Also, my original concern was that LoRA could not be used for HuBERT / WavLM / Wav2vec2 models in the current codebase. |
This PR only adds the option of enabling `lora.mark_only_lora_as_trainable` (via the `lora_only` argument).
OK, now I understand the point. Thanks for the explanation.
@ftshijt I was trying to recall that. Maybe my previous concern was that the previous configs (maybe pretrained models) had set the arguments as:

```yaml
use_adapter: true
adapter: lora
adapter_conf: ...
```

It would be better to update those configs if there are any.
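An updated config along the lines suggested above might look like the following sketch. The `use_adapter`, `adapter`, and `adapter_conf` keys are quoted from the discussion; `lora_only: true` mirrors the new option's default, and the `adapter_conf` values shown are purely illustrative:

```yaml
# Illustrative adapter config; adapter_conf values are hypothetical.
use_adapter: true
adapter: lora
adapter_conf:
    rank: 8
    alpha: 8
lora_only: true   # freeze everything except the LoRA layers (new default)
```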
@simpleoier For the
Hi @Stanwang1210, using
So, are there any remaining issues with this PR?
This pull request is now in conflict :( |
What?

I add an option `lora_only` for enabling `lora.mark_only_lora_as_trainable`, which freezes all parameters except for the LoRA layers. The default value of `lora_only` is set to `True`.

Why?

As suggested by @ftshijt, it is better for us to provide this option.
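The freezing behavior described above can be sketched as follows. This is a minimal illustration, not the actual ESPnet or `loralib` code: the real `lora.mark_only_lora_as_trainable` operates on a `torch.nn.Module`, whereas here a plain dict of parameter names stands in for the model, relying only on the convention that LoRA-injected parameters carry a `lora_` prefix in their names:

```python
# Sketch of what enabling lora_only does: every parameter whose name
# does not mark it as a LoRA weight gets frozen, so only the LoRA
# layers receive gradient updates during fine-tuning.

def mark_only_lora_as_trainable(params):
    """params: dict mapping parameter name -> {"requires_grad": bool}."""
    for name, state in params.items():
        # loralib names its injected parameters "lora_A" / "lora_B";
        # only those stay trainable.
        state["requires_grad"] = "lora_" in name

# Toy stand-in for a model's named parameters (names are illustrative).
params = {
    "encoder.attn.q_proj.weight": {"requires_grad": True},
    "encoder.attn.q_proj.lora_A": {"requires_grad": True},
    "encoder.attn.q_proj.lora_B": {"requires_grad": True},
    "decoder.output.bias": {"requires_grad": True},
}
mark_only_lora_as_trainable(params)
trainable = [n for n, s in params.items() if s["requires_grad"]]
```

After the call, only the two `lora_` parameters remain trainable, which is exactly the "freeze all parameters except for LoRA layers" behavior the option controls.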
See also