Add New Perplexity Models #10652
Conversation
Sonar Deep Research might need some review as well, since it has a context length of 128k. Per the docs, only Sonar Pro has a max output token limit; it is not mentioned for the other models.
ishaan-jaff
left a comment
reviewed
    "litellm_provider": "perplexity",
    "mode": "chat",
    "search_context_cost_per_query": {
        "search_context_size_low": 5e-3,
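For reference, Perplexity quotes search pricing per 1,000 requests, while this JSON field stores a per-query cost, so the 5e-3 here would correspond to $5 per 1,000 low-context searches. A minimal sketch of that conversion (the $5 figure is inferred from the value in the diff, not independently verified against Perplexity's pricing page):

```python
def per_query_cost(usd_per_thousand_requests: float) -> float:
    """Convert a price quoted per 1,000 requests to a per-query cost."""
    return usd_per_thousand_requests / 1_000

# Assumed: $5 per 1,000 "low" search-context requests -> 5e-3 per query
assert per_query_cost(5) == 5e-3

# Estimated search cost for 200 queries at the assumed "low" tier
print(f"${200 * per_query_cost(5):.2f}")  # $1.00
```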
Where did you get this from? Can you add a source?
    "input_cost_per_token": 2e-6,
    "output_cost_per_token": 8e-6,
-   "output_cost_per_reasoning_token": 3e-5,
+   "output_cost_per_reasoning_token": 3e-6,
Can you add a source or screenshot for this change from the Perplexity pricing page?
https://docs.perplexity.ai/models/models/sonar-deep-research — I think it's a typo here. Pricing is per million tokens, so it should be e-6, not e-5.
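The fix follows from a unit conversion: Perplexity quotes prices per million tokens, while litellm's cost map stores cost per token, so a listed price of $N per 1M tokens becomes N × 10⁻⁶ per token. A quick sketch (the $3-per-1M figure for reasoning tokens is the one implied by this thread, taken as an assumption):

```python
def per_token_cost(usd_per_million_tokens: float) -> float:
    """Convert a price quoted per 1M tokens to a per-token cost."""
    return usd_per_million_tokens / 1_000_000

# Assumed: $3 per 1M reasoning tokens -> 3e-6 per token (not 3e-5)
assert per_token_cost(3) == 3e-6

# Estimated cost of a response that used 12,000 reasoning tokens
print(f"${12_000 * per_token_cost(3):.4f}")  # $0.0360
```

The e-5 value would have billed reasoning tokens at $30 per 1M, ten times the documented price, which is why the reviewer flagged it as a typo.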
ishaan-jaff
left a comment
LGTM
Title
Adds more Perplexity models: Sonar, Sonar Pro, Sonar Reasoning, and Sonar Reasoning Pro.
Relevant issues
#10537 removed some Perplexity models that I use; I have added them back with more accurate pricing and numbers, using the format from Sonar Reasoning Pro.
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement - see details); passes make test-unit
Type
🆕 New Feature
Changes