Using --chat-template #8009
Deputation asked this question in Q&A
Hi, I have been trying to get the chatml template to work with llama.cpp, as the quality of the answers I am getting is really bad: the model sometimes hallucinates new roles, starts talking to itself, or starts writing down notes. Something is off. This is the command I am running:
```
./llama-cli --verbose --model llama-3-8b-instruct-f16.gguf -p "System Prompt\n" -n -1 -fa -sp -if --mirostat 2 -co --penalize-nl -r "USER:" --in-suffix "AI:" --in-prefix "USER:" --chat-template chatml
```
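As context for the flags above: in llama-cli the --chat-template option is applied when conversation mode (-cnv) is enabled; in plain interactive mode the -p, --in-prefix, and --in-suffix strings are inserted verbatim instead. Also, Llama-3-8B-Instruct was trained on the Llama 3 chat format, not ChatML. A minimal conversation-mode sketch along those lines (the flags shown are real llama-cli options; the system prompt is illustrative, and sampling flags are omitted for brevity):

```
# conversation mode: -p becomes the system message and the
# built-in llama3 template formats each turn
./llama-cli --model llama-3-8b-instruct-f16.gguf \
    -cnv --chat-template llama3 \
    -p "You are a helpful assistant." \
    -n -1 -co
```

In this mode -if, -r, --in-prefix, and --in-suffix should be unnecessary, since the template itself supplies the turn structure.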
Unfortunately, nothing seems to work properly (enabling -cnv, writing out the chatml format manually), even though chatml should be accepted as a --chat-template value. I have never really managed to get chat working well in llama.cpp, and I would like to be able to do so.
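For the manual route mentioned above, a sketch of the raw prompt format Llama 3 Instruct expects, per Meta's published model card (note the blank line after each <|end_header_id|> header; ChatML's <|im_start|>/<|im_end|> markers are a different format that this model was not trained on):

```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

You are a helpful assistant.<|eot_id|><|start_header_id|>user<|end_header_id|>

Hello!<|eot_id|><|start_header_id|>assistant<|end_header_id|>

```

Feeding ChatML markers to a model trained on this format is a plausible cause of the hallucinated roles and self-talk described above.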
Thank you in advance for your time and patience.