I am fine-tuning various models that use different prompt styles. At inference time I need to format the prompt appropriately for each model. It would be great if it were possible to save a metadata or config JSON when creating the model and retrieve it along with the model. Otherwise I would need to either:

1. manage my own external database, which is cumbersome when working in Google Colab (a minimal version of this is sketched below),
2. hack the model suffix to encode my config params (let's not do that), or
3. reverse engineer my training JSONL to deduce the model's specific prompt styling, which is difficult.

Any other options?
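For context, the lightest version of option (1) I can think of is a small JSON registry keyed by the fine-tuned model ID, kept on mounted Google Drive so it survives Colab sessions. This is just a sketch; the file path, model name, and config keys are illustrative, not anything the API provides:

```python
import json
from pathlib import Path

# Illustrative path; in Colab this could live on mounted Google Drive
# so it persists across sessions.
REGISTRY_PATH = Path("model_prompt_configs.json")


def save_prompt_config(model_id: str, config: dict) -> None:
    """Record the prompt-styling config for a fine-tuned model."""
    registry = json.loads(REGISTRY_PATH.read_text()) if REGISTRY_PATH.exists() else {}
    registry[model_id] = config
    REGISTRY_PATH.write_text(json.dumps(registry, indent=2))


def load_prompt_config(model_id: str) -> dict:
    """Look up the prompt-styling config for a fine-tuned model at inference time."""
    registry = json.loads(REGISTRY_PATH.read_text())
    return registry[model_id]


# Example usage (model ID and config keys are made up):
save_prompt_config(
    "davinci:ft-my-org:summarizer",
    {"prompt_template": "{input}\n\n###\n\n", "stop": ["END"]},
)
config = load_prompt_config("davinci:ft-my-org:summarizer")
prompt = config["prompt_template"].format(input="Some document text")
```

It works, but it is still an external store I have to carry around, which is exactly what I was hoping to avoid.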