add param mapping to docs

Krrish Dholakia 2023-10-03 09:20:58 -07:00
parent 512769e841
commit 430e2698a2
2 changed files with 28 additions and 3 deletions


@@ -53,4 +53,29 @@ The Input params are **exactly the same** as the <a href="https://platform.opena
- `logit_bias`: *map (optional)* - Used to modify the probability of specific tokens appearing in the completion.
- `user`: *string (optional)* - A unique identifier representing your end-user. This can help OpenAI to monitor and detect abuse.

# Params supported across providers

This is a list of the OpenAI params we translate across providers. You can send any provider-specific param by just including it in `completion()`; e.g. if Anthropic supports `top_k`, then `completion(model="claude-2", .., top_k=3)` sends the value straight to Anthropic (see the sketch after the table below).

This list is constantly being updated.

| Provider | functions | function_call | temperature | top_p | n | stream | stop | max_tokens | presence_penalty | frequency_penalty | logit_bias | user |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
|Anthropic| | | ✅ | ✅ | | ✅ | ✅ | ✅ | | | | |
|OpenAI| ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
|Replicate| | | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | |
|Cohere| | | ✅ | ✅ | | ✅ | | ✅ | | | ✅ | |
|Huggingface| | | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | |
|Openrouter| ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
|AI21| | | | | | | | | | | | |
|VertexAI| | | ✅ | ✅ | | ✅ | | ✅ | | | | |
|Bedrock| | | ✅ | ✅ | | ✅ | ✅ | ✅ | | | | |
|Sagemaker| | | ✅ | | | ✅ | | ✅ | | | | |
|TogetherAI| | | ✅ | ✅ | | ✅ | ✅ | ✅ | | ✅ | | |
|AlephAlpha| | | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | |
|Palm| | | ✅ | ✅ | | ✅ | | | | | | |
|NLP Cloud| | | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | |
|Petals| | | ✅ | ✅ | | | | ✅ | | | | |
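
For example, here is a minimal sketch of that passthrough behavior (this assumes `litellm` is installed and an `ANTHROPIC_API_KEY` is set in your environment; the message content is illustrative):

```python
from litellm import completion

# `top_k` is not an OpenAI param, but Anthropic supports it, so litellm
# forwards it to the provider untouched instead of dropping it
response = completion(
    model="claude-2",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    top_k=3,  # provider-specific param, passed straight to Anthropic
)
print(response)
```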


@@ -217,8 +217,8 @@ def completion(
metadata = kwargs.get('metadata', None)
fallbacks = kwargs.get('fallbacks', [])
######## end of unpacking kwargs ###########
-openai_params = ["functions", "function_call", "temperature", "temperature", "top_p", "n", "stream", "stop", "max_tokens", "presence_penalty", "frequency_penalty", "logit_bias", "user", "metadata"]
-litellm_params = ["acompletion", "caching", "return_async", "mock_response", "api_key", "api_version", "api_base", "force_timeout", "logger_fn", "verbose", "custom_llm_provider", "litellm_logging_obj", "litellm_call_id", "use_client", "id", "metadata", "fallbacks"]
+openai_params = ["functions", "function_call", "temperature", "top_p", "n", "stream", "stop", "max_tokens", "presence_penalty", "frequency_penalty", "logit_bias", "user"]
+litellm_params = ["metadata", "acompletion", "caching", "return_async", "mock_response", "api_key", "api_version", "api_base", "force_timeout", "logger_fn", "verbose", "custom_llm_provider", "litellm_logging_obj", "litellm_call_id", "use_client", "id", "fallbacks"]
default_params = openai_params + litellm_params
non_default_params = {k: v for k,v in kwargs.items() if k not in default_params} # model-specific params - pass them straight to the model/provider
if mock_response:
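
To illustrate what the split above does, a standalone sketch (the example kwargs are hypothetical, and the rest of `completion()`'s logic is omitted):

```python
openai_params = ["functions", "function_call", "temperature", "top_p", "n",
                 "stream", "stop", "max_tokens", "presence_penalty",
                 "frequency_penalty", "logit_bias", "user"]
litellm_params = ["metadata", "acompletion", "caching", "return_async",
                  "mock_response", "api_key", "api_version", "api_base",
                  "force_timeout", "logger_fn", "verbose", "custom_llm_provider",
                  "litellm_logging_obj", "litellm_call_id", "use_client", "id",
                  "fallbacks"]
default_params = openai_params + litellm_params

# hypothetical kwargs from a call like completion(model="claude-2", .., top_k=3)
kwargs = {"temperature": 0.7, "metadata": {"trace_id": "abc"}, "top_k": 3}

# anything not recognized as an OpenAI or litellm param is treated as a
# model/provider-specific param and forwarded to the provider untouched
non_default_params = {k: v for k, v in kwargs.items() if k not in default_params}
assert non_default_params == {"top_k": 3}
```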