Added functionary v2.2 chat handler #1131

Closed · wants to merge 1 commit

Conversation

@AbdoMahfoz commented Jan 26, 2024

  • Supports both streaming and non-streaming responses.
  • Tested in a local project.
  • I reused the code that generates the tool-call schema from the existing functionary chat handler and rewrote almost everything else, since the prompt format changed completely.
  • The built-in tokenizer had problems tokenizing special tokens like <|content|> and <|stop|>, so I used the transformers package to get an AutoTokenizer that works with this model and plugged its output into llama.generate (see the sketch after this list).
    • As a side effect, sentencepiece is now a requirement.
  • Updated the README to reflect support for the newest model I have tested so far.
  • Chat format is registered as functionary2 to retain compatibility with any code that uses the old formatter with older models (see the usage example below).
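
Below is a minimal sketch of that tokenizer workaround, assuming the meetkai/functionary-small-v2.2 weights on Hugging Face and a locally downloaded GGUF file; the model path, prompt, and sampling settings are illustrative, not the handler's actual internals.

```python
from llama_cpp import Llama
from transformers import AutoTokenizer  # pulls in sentencepiece

# Hypothetical local path; any functionary v2.2 GGUF quantization works.
llm = Llama(model_path="./functionary-small-v2.2.q4_0.gguf", n_ctx=4096)

# The built-in tokenizer mishandled special tokens such as <|content|> and
# <|stop|>, so the prompt is tokenized with the HF tokenizer instead.
tokenizer = AutoTokenizer.from_pretrained("meetkai/functionary-small-v2.2")
prompt = (
    "<|from|>user\n<|content|>Hello!\n"
    "<|from|>assistant\n<|content|>"
)  # simplified illustration of the v2.2 special-token format
tokens = tokenizer.encode(prompt)

# Feed the externally produced token ids straight into llama.generate and
# stop on the model's <|stop|> token rather than the usual EOS.
stop_id = tokenizer.convert_tokens_to_ids("<|stop|>")
for token in llm.generate(tokens, temp=0.0):
    if token in (stop_id, llm.token_eos()):
        break
    print(tokenizer.decode([token]), end="", flush=True)
```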

Here are samples of the new chat format and old chat format.

Please note that the current functionary handler didn't work with the functionary v1.4 model.
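
For context, here is a hypothetical end-to-end call through the high-level API once the handler is registered. The model path and the get_weather tool are made up for illustration; only the functionary2 chat format name comes from this PR.

```python
from llama_cpp import Llama

llm = Llama(
    model_path="./functionary-small-v2.2.q4_0.gguf",  # hypothetical path
    chat_format="functionary2",  # the name this PR registers
    n_ctx=4096,
)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What is the weather in Istanbul?"}],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # illustrative tool, not part of the PR
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
)
print(response["choices"][0]["message"])
```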

@abetlen (Owner) commented May 14, 2024

@AbdoMahfoz thank you so much for the PR, and sorry for responding so late. At the time, Jeffrey from the functionary team had submitted the same work shortly before you, so I merged that one. Thank you again nonetheless.

@abetlen closed this May 14, 2024

@AbdoMahfoz (Author) commented
No worries ❤️. I needed it for a local project, so it served me well all this time 😂
