docs(input.md): add docs on 'get_supported_openai_params'
parent b0fa25dfbd
commit d8a6b8216d
1 changed file with 11 additions and 0 deletions
@@ -24,6 +24,17 @@ print(response)
```

### Translated OpenAI params

Use this function to get an up-to-date list of supported OpenAI params for any model + provider.

```python
from litellm import get_supported_openai_params
response = get_supported_openai_params(model="anthropic.claude-3", custom_llm_provider="bedrock")
print(response) # ["max_tokens", "tools", "tool_choice", "stream"]
```
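
One common use: filter the OpenAI params you want to send down to the ones a given model + provider actually supports before making the call. The snippet below is a minimal sketch; the `desired_params` dict and the Bedrock model id are illustrative placeholders, not values this function requires.

```python
from litellm import completion, get_supported_openai_params

# Params we'd like to send; not every provider supports all of them.
desired_params = {"max_tokens": 256, "temperature": 0.7, "logprobs": True}

supported = get_supported_openai_params(
    model="anthropic.claude-3", custom_llm_provider="bedrock"
)

# Keep only the params this model + provider combination supports.
safe_params = {k: v for k, v in desired_params.items() if k in supported}

response = completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",  # illustrative model id
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    **safe_params,
)
```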

This is a list of OpenAI params we translate across providers.

This list is constantly being updated.