
Conversation

@keyute
Contributor

@keyute keyute commented May 8, 2025

Title

Adds more Perplexity models: Sonar, Sonar Pro, Sonar Reasoning, and Sonar Reasoning Pro

Relevant issues

#10537 removed some Perplexity models that I use. I have added them back with more accurate pricing and numbers, using the format from Sonar Reasoning Pro.

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory. Adding at least 1 test is a hard requirement (see details)
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🆕 New Feature

Changes

  • Added Sonar, Sonar Pro, Sonar Reasoning and Sonar Reasoning Pro models using the pattern from Sonar Deep Research
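For reference, the added entries follow the shape of litellm's model cost map, where each field prices tokens in USD per single token. A minimal sketch of how such an entry prices one request; the model name, token counts, and cost values here are illustrative, not authoritative:

```python
# Minimal sketch of a litellm-style model-cost entry following the
# Sonar Deep Research pattern. Field values are illustrative; actual
# numbers should come from Perplexity's pricing page.
MODEL_COSTS = {
    "perplexity/sonar-reasoning-pro": {
        "max_input_tokens": 128000,        # 128k context length
        "input_cost_per_token": 2e-6,      # $2 per 1M input tokens
        "output_cost_per_token": 8e-6,     # $8 per 1M output tokens
        "litellm_provider": "perplexity",
        "mode": "chat",
    },
}

def completion_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """USD cost of one request, computed from the cost table."""
    entry = MODEL_COSTS[model]
    return (input_tokens * entry["input_cost_per_token"]
            + output_tokens * entry["output_cost_per_token"])

# 1000 input + 500 output tokens: 0.002 + 0.004, roughly $0.006.
print(completion_cost("perplexity/sonar-reasoning-pro", 1000, 500))
```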

@vercel

vercel bot commented May 8, 2025

The latest updates on your projects.

litellm: ✅ Ready. Updated May 8, 2025, 7:24am (UTC)

@CLAassistant

CLAassistant commented May 8, 2025

CLA assistant check
All committers have signed the CLA.

@keyute
Contributor Author

keyute commented May 8, 2025

Sonar Deep Research might need some review as well, since it has a context length of 128k. Per the docs, only Sonar Pro has a max output token limit; it is not mentioned for the other models.

Contributor

@ishaan-jaff ishaan-jaff left a comment


reviewed

"litellm_provider": "perplexity",
"mode": "chat",
"search_context_cost_per_query": {
"search_context_size_low": 5e-3,
Contributor


Where did you get this from? Can you add a source?
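For reference only: the search_context_cost_per_query block in the quoted hunk is a flat per-query fee layered on top of token costs. A minimal sketch of how such a fee might be applied; the low-tier key and its 5e-3 value come from the hunk, while the helper itself is illustrative:

```python
# Flat per-query search-context fee added to the token-based cost.
# The low-tier value (5e-3 USD, i.e. $5 per 1,000 queries) is taken
# from the quoted hunk; the helper is an illustrative sketch.
SEARCH_CONTEXT_COST_PER_QUERY = {
    "search_context_size_low": 5e-3,
}

def request_cost(token_cost: float, context_size: str) -> float:
    """Total USD cost for one request: token cost plus search fee."""
    return token_cost + SEARCH_CONTEXT_COST_PER_QUERY[context_size]

# $0.006 in tokens + $0.005 search fee: about $0.011 per request.
print(request_cost(0.006, "search_context_size_low"))
```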

Contributor Author


  "input_cost_per_token": 2e-6,
  "output_cost_per_token": 8e-6,
- "output_cost_per_reasoning_token": 3e-5,
+ "output_cost_per_reasoning_token": 3e-6,
Contributor


Can you add a source or screenshot for this change from Perplexity's pricing?

Contributor Author


https://docs.perplexity.ai/models/models/sonar-deep-research I think it's a typo here: pricing is quoted per million tokens, so the value should be e-6, not e-5.
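To make the unit mix-up concrete: litellm's cost fields are USD per single token, while Perplexity quotes prices in USD per million tokens. A quick sketch of the conversion; the $3 figure is the per-million reasoning-token price discussed above:

```python
# Convert a USD-per-million-token price (how Perplexity quotes it)
# into the USD-per-token value the litellm cost map expects.
def usd_per_token(usd_per_million_tokens: float) -> float:
    return usd_per_million_tokens / 1_000_000

# $3 per 1M reasoning tokens -> 3e-6 per token (not 3e-5).
print(usd_per_token(3.0))  # 3e-06
```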

Contributor

@ishaan-jaff ishaan-jaff left a comment


LGTM

@ishaan-jaff ishaan-jaff merged commit 416429e into BerriAI:main May 8, 2025
5 checks passed

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

3 participants