This commit is contained in:
lucca 2024-04-04 16:58:51 -03:00
parent f88545261c
commit be265fbb15
No known key found for this signature in database
GPG key ID: 4AD5F117F068DBC1
5 changed files with 23 additions and 1 deletion

@@ -87,6 +87,7 @@
| command-light | cohere | 0.00003 |
| command-medium-beta | cohere | 0.00003 |
| command-xlarge-beta | cohere | 0.00003 |
| command-r-plus | cohere | 0.000018 |
| j2-ultra | ai21 | 0.00003 |
| ai21.j2-ultra-v1 | bedrock | 0.0000376 |
| gpt-4-1106-preview | openai | 0.00004 |
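
The prices in this table are per token, and the 0.000018 figure for command-r-plus matches the sum of the input (0.000003) and output (0.000015) per-token prices added to the cost map further down. A back-of-the-envelope sketch, assuming the figure is USD per token (illustrative arithmetic only, not litellm code):

```python
# Illustrative arithmetic only, not litellm code. Assumes the table's figure
# is the combined USD price per token for command-r-plus.
price_per_token = 0.000018   # from the table row for command-r-plus
tokens_used = 50_000         # hypothetical total tokens across requests

estimated_cost_usd = price_per_token * tokens_used
print(f"~${estimated_cost_usd:.2f} for {tokens_used:,} tokens")  # ~$0.90 for 50,000 tokens
```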

@@ -72,7 +72,7 @@ Here's the code for how we format all providers. Let us know how we can improve
| Anthropic | `claude-instant-1`, `claude-instant-1.2`, `claude-2` | [Code](https://github.com/BerriAI/litellm/blob/721564c63999a43f96ee9167d0530759d51f8d45/litellm/llms/anthropic.py#L84)
| OpenAI Text Completion | `text-davinci-003`, `text-curie-001`, `text-babbage-001`, `text-ada-001`, `babbage-002`, `davinci-002` | [Code](https://github.com/BerriAI/litellm/blob/721564c63999a43f96ee9167d0530759d51f8d45/litellm/main.py#L442)
| Replicate | all model names starting with `replicate/` | [Code](https://github.com/BerriAI/litellm/blob/721564c63999a43f96ee9167d0530759d51f8d45/litellm/llms/replicate.py#L180)
| Cohere | `command-nightly`, `command`, `command-light`, `command-medium-beta`, `command-xlarge-beta` | [Code](https://github.com/BerriAI/litellm/blob/721564c63999a43f96ee9167d0530759d51f8d45/litellm/llms/cohere.py#L115)
| Cohere | `command-nightly`, `command`, `command-light`, `command-medium-beta`, `command-xlarge-beta`, `command-r-plus` | [Code](https://github.com/BerriAI/litellm/blob/721564c63999a43f96ee9167d0530759d51f8d45/litellm/llms/cohere.py#L115)
| Huggingface | all model names starting with `huggingface/` | [Code](https://github.com/BerriAI/litellm/blob/721564c63999a43f96ee9167d0530759d51f8d45/litellm/llms/huggingface_restapi.py#L186)
| OpenRouter | all model names starting with `openrouter/` | [Code](https://github.com/BerriAI/litellm/blob/721564c63999a43f96ee9167d0530759d51f8d45/litellm/main.py#L611)
| AI21 | `j2-mid`, `j2-light`, `j2-ultra` | [Code](https://github.com/BerriAI/litellm/blob/721564c63999a43f96ee9167d0530759d51f8d45/litellm/llms/ai21.py#L107)
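
The linked provider code is what turns an OpenAI-style messages list into each provider's expected prompt format. As a conceptual sketch of that flattening step (not litellm's actual cohere.py implementation):

```python
# Conceptual sketch only -- not litellm's actual provider code. It shows the
# kind of role-tagged flattening a prompt formatter performs when a provider
# expects a single prompt string instead of a list of chat messages.
def flatten_messages(messages: list[dict]) -> str:
    parts = [f"{m['role'].capitalize()}: {m['content']}" for m in messages]
    parts.append("Assistant:")  # leave the final turn open for the model
    return "\n".join(parts)

print(flatten_messages([
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Summarize unified diffs in one line."},
]))
```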

@@ -47,6 +47,7 @@ for chunk in response:
|------------|----------------|
| command-r | `completion('command-r', messages)` |
| command-light | `completion('command-light', messages)` |
| command-r-plus | `completion('command-r-plus', messages)` |
| command-medium | `completion('command-medium', messages)` |
| command-medium-beta | `completion('command-medium-beta', messages)` |
| command-xlarge-nightly | `completion('command-xlarge-nightly', messages)` |
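
With the new row above, the model is called through the same completion interface as the other Cohere models. A minimal usage sketch (the API key value is a placeholder):

```python
# Minimal usage sketch for the new model name; the key below is a placeholder.
import os
from litellm import completion

os.environ["COHERE_API_KEY"] = "your-cohere-api-key"

response = completion(
    model="command-r-plus",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
)
print(response.choices[0].message.content)
```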

@@ -1189,6 +1189,16 @@
"litellm_provider": "cohere_chat",
"mode": "chat"
},
"command-r-plus": {
"max_tokens": 4096,
"max_input_tokens": 128000,
"max_output_tokens": 4096,
"input_cost_per_token": 0.000003,
"output_cost_per_token": 0.000015,
"litellm_provider": "cohere_chat",
"mode": "chat",
"supports_function_calling": true
},
"command-nightly": {
"max_tokens": 4096,
"max_input_tokens": 4096,

@@ -1189,6 +1189,16 @@
"litellm_provider": "cohere_chat",
"mode": "chat"
},
"command-r-plus": {
"max_tokens": 4096,
"max_input_tokens": 128000,
"max_output_tokens": 4096,
"input_cost_per_token": 0.000003,
"output_cost_per_token": 0.000015,
"litellm_provider": "cohere_chat",
"mode": "chat",
"supports_function_calling": true
},
"command-nightly": {
"max_tokens": 4096,
"max_input_tokens": 4096,