Fix: update from disable_adapter to disable_adapters #221
Conversation
Hi, we'd like to be able to reproduce the issue on our side in order to evaluate the PR. We will also comment on the issue you opened, unslothai/unsloth#3067, for further discussion.
    logits_to_keep = logits_to_keep + 1,
).logits
with torch.inference_mode():
    unwarp_model = trainer.accelerator.unwrap_model(trainer.model, keep_fp32_wrapper = False)
Instead of an if/else, could we create a conditional context? Something like:
from contextlib import nullcontext

unwrap_model = trainer.accelerator.unwrap_model(trainer.model, keep_fp32_wrapper=False)
try:
    disable_adapter_ctx = base.disable_adapter()
except AttributeError:
    disable_adapter_ctx = nullcontext()
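For illustration only, a sketch of how such a conditional context might then wrap the reference forward pass; `base` and `inputs` are placeholder names that are not defined in this PR:

```python
from contextlib import nullcontext

import torch

# Sketch: fall back to a no-op context when the model exposes no
# disable_adapter() (i.e. no PEFT adapters are attached).
try:
    disable_adapter_ctx = base.disable_adapter()
except AttributeError:
    disable_adapter_ctx = nullcontext()

# Compute reference logits with the adapters temporarily disabled.
with torch.inference_mode(), disable_adapter_ctx:
    ref_logits = unwrap_model(**inputs).logits
```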
Thanks for your comments!

- I'm sorry, what is `base`?
- I think `try`/`except` should be a last resort for handling this issue; I'm not sure it's a good fit here.
).logits
with torch.inference_mode():
    unwarp_model = trainer.accelerator.unwrap_model(trainer.model, keep_fp32_wrapper = False)
    if unwarp_model._hf_peft_config_loaded:
Just a small nit: `unwarp_model` should be `unwrap_model`.
Hi @wa008, I made some code comments. But one issue is that it should not be calling [...]. If you could update based on the comments I made, we should be good to go.
Thanks for your reply and suggestions. I'm sorry, I didn't fully understand what you meant. If you'd like, you can go ahead and submit a new PR for this issue; I'm completely fine with that.
Solve unsloth issue-3064.

Related link: `disable_adapters` function
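For background, a minimal sketch of the distinction the PR title points at, assuming a transformers model with the PEFT integration loaded (`_hf_peft_config_loaded` is True); the helper name and `inputs` below are hypothetical, not code from this PR:

```python
import torch

def reference_logits_without_adapters(unwrap_model, inputs):
    """Hypothetical helper: compute reference logits with LoRA adapters off.

    peft.PeftModel.disable_adapter() is a context manager, while a plain
    transformers model with the PEFT integration loaded exposes the
    disable_adapters()/enable_adapters() methods instead, which is why the
    call site branches on _hf_peft_config_loaded.
    """
    with torch.inference_mode():
        unwrap_model.disable_adapters()
        try:
            logits = unwrap_model(**inputs).logits
        finally:
            # Re-enable the adapters so later training steps are unaffected.
            unwrap_model.enable_adapters()
    return logits
```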