Add support for perplexity/sonar-reasoning #633
Conversation
```ts
model: this.getModel().id,
max_tokens: maxTokens,
temperature: temperature,
top_p: topP,
```
Is there any chance of this breaking API calls to providers who don't support `top_p`? If so we might consider doing what we do with the `transforms` field below.
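For reference, one way to do that (a minimal sketch with illustrative names, not the PR's actual code) is to spread `top_p` into the request only when a value is set, the same conditional-spread pattern used for the `transforms` field:

```ts
// Sketch: include top_p only when a value was chosen for this model,
// so providers that reject unknown fields never see it at all.
// `buildCreateParams` is an invented helper, not from the PR.
function buildCreateParams(
	modelId: string,
	maxTokens: number,
	temperature: number,
	topP?: number,
): Record<string, unknown> {
	return {
		model: modelId,
		max_tokens: maxTokens,
		temperature,
		// Spreading `false` is a no-op, so the field is simply omitted
		// when topP is undefined.
		...(topP !== undefined && { top_p: topP }),
	}
}

// Only R1-style models pass an explicit top_p; other models omit the field.
const params = buildCreateParams("perplexity/sonar-reasoning", 8192, 0.6, 0.95)
```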
This is the same as with `include_reasoning`: it's passed to every model, not only for DeepSeek.
From the OpenRouter docs:
> Non-standard parameters: If the chosen model doesn't support a request parameter (such as `logit_bias` in non-OpenAI models, or `top_k` for OpenAI), then the parameter is ignored. The rest are forwarded to the underlying model API.
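In practice this means the same request shape can be sent regardless of model. A hedged sketch against OpenRouter's documented chat-completions endpoint (the model and sampling values here are just examples):

```ts
// Sketch: one request body for all models. Per the docs quoted above,
// unsupported parameters (e.g. top_p on some providers) are dropped by
// OpenRouter rather than causing the call to fail.
const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
	method: "POST",
	headers: {
		Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
		"Content-Type": "application/json",
	},
	body: JSON.stringify({
		model: "perplexity/sonar-reasoning",
		messages: [{ role: "user", content: "Hello" }],
		temperature: 0.6,
		top_p: 0.95, // ignored by providers that don't support it
	}),
})
console.log((await response.json()).choices?.[0]?.message?.content)
```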
Description

Adds support for the `perplexity/sonar-reasoning` model from OpenRouter. It's based on DeepSeek-R1, so it is treated in the same way. I noticed that some providers allow setting `top_p`; I set it to the value used in the DeepSeek benchmarks.
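A rough sketch of that gating (an assumed shape, not the PR's exact diff; `R1_STYLE_MODELS`, `defaultSampling`, and the fallback temperature of 0 are illustrative):

```ts
// Models derived from DeepSeek-R1 get the sampling values reported in
// the DeepSeek benchmarks; other models keep the handler's defaults.
const R1_STYLE_MODELS = ["deepseek/deepseek-r1", "perplexity/sonar-reasoning"]

function defaultSampling(modelId: string): { temperature: number; top_p?: number } {
	if (R1_STYLE_MODELS.includes(modelId)) {
		return { temperature: 0.6, top_p: 0.95 }
	}
	return { temperature: 0 } // illustrative default, not from the PR
}
```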
Type of change
How Has This Been Tested?
Checklist:
Additional context
Related Issues
Reviewers
Important
Add support for `perplexity/sonar-reasoning` model in `OpenRouterHandler` with specific parameters.

- Adds the `perplexity/sonar-reasoning` model in `OpenRouterHandler`.
- Sets `temperature` to 0.6 and `top_p` to 0.95 for the `perplexity/sonar-reasoning` and `deepseek-r1` models.
- Changes are in `openrouter.ts`.

This description was created by
for 1534a9c. It will automatically update as commits are pushed.