Open
Labels
type/feature: Issue or PR related to a new feature
Description
Describe the feature you'd like to request
Hi, I'm running into an issue where I cannot load the large model. I have two GPUs, each with 8 GB of VRAM. Is it possible to split the work between the two GPUs?
If not, is it possible to disable the large model on the UI side and default to another size, such as medium?
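For context, here is a minimal sketch of how the model could be sharded across both GPUs, assuming a Hugging Face transformers-based loader with the accelerate package installed. The checkpoint ids and the VRAM figures are illustrative only and may not match how this project actually loads models.

```python
# Minimal sketch: shard a Whisper checkpoint across all visible GPUs.
# Assumes a Hugging Face transformers-based loader; checkpoint ids are
# illustrative and may differ from what this project uses.
import torch
from transformers import WhisperForConditionalGeneration, WhisperProcessor

model_id = "openai/whisper-large-v2"  # hypothetical checkpoint for illustration

# device_map="auto" lets `accelerate` place layers across every visible GPU,
# so two 8 GB cards can jointly hold a model that does not fit on one.
model = WhisperForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 roughly halves the VRAM footprint
    device_map="auto",          # requires the `accelerate` package
)
processor = WhisperProcessor.from_pretrained(model_id)
```

If sharding is not feasible, the fallback would be a setting that hides or disables the large option in the UI and defaults the selection to a smaller size such as medium.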
Describe the solution you'd like
No response