LoRA support in transformers #96

@tanmayshishodia

Description


Hi, I went through the praxis.transformers.StackedTransformer layer and I don't see any support for LoRA.

That said, I was wondering whether there is a way to add a set of new LoRA weights to an already existing paxConfig model. If you could give an example, that would be great. The tutorials don't cover any use case where the model layers are updated after the fact.
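For context, here is a minimal sketch of what LoRA does, written in plain JAX rather than against any praxis API (the function names, shapes, and hyperparameters below are illustrative assumptions, not praxis code): the pretrained weight `W` is frozen, and only a low-rank update `B @ A` scaled by `alpha / r` is trained.

```python
import jax
import jax.numpy as jnp

# Minimal LoRA sketch in plain JAX (not praxis-specific; names are hypothetical).
# LoRA freezes the pretrained weight W and learns a low-rank update B @ A,
# so the effective projection becomes W + (alpha / r) * B @ A.

def init_lora_params(key, d_in, d_out, rank=8):
    """Initialize only the trainable LoRA factors; W stays frozen."""
    return {
        # A starts random, B starts at zero, so B @ A == 0 and training
        # begins from the unmodified pretrained behavior.
        "A": jax.random.normal(key, (rank, d_in)) * 0.01,
        "B": jnp.zeros((d_out, rank)),
    }

def lora_linear(x, W, lora, alpha=16.0):
    """Frozen dense projection plus the scaled low-rank LoRA update."""
    rank = lora["A"].shape[0]
    base = x @ W.T                             # frozen pretrained path
    update = (x @ lora["A"].T) @ lora["B"].T   # trainable low-rank path
    return base + (alpha / rank) * update

# Example usage with made-up shapes.
key = jax.random.PRNGKey(0)
W = jax.random.normal(key, (512, 256))  # stands in for a pretrained weight
lora = init_lora_params(key, d_in=256, d_out=512, rank=8)
x = jnp.ones((4, 256))
print(lora_linear(x, W, lora).shape)  # (4, 512)
```

Applying this to an existing Pax model would mean wiring an update like this into each attention/feed-forward projection and marking only the LoRA factors as trainable; how to do that through the config system is exactly what I'm asking about.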
