ishaan-jaff 2023-09-16 13:01:11 -07:00
parent ed94410ef5
commit 33693615ed


# get context window & cost per token
For every LLM, LiteLLM allows you to:
* Get model context window
* Get cost per token
## using api.litellm.ai
Usage
```shell
curl 'https://api.litellm.ai/get_max_tokens?model=claude-2'
```
### output
```json
{
  "input_cost_per_token": 1.102e-05,
  ...
}
```
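The same endpoint can be called from Python. Below is a minimal sketch using only the standard library; the query parameter mirrors the curl example above, and the helper names (`build_request_url`, `parse_response`) are hypothetical, not part of LiteLLM. The sample body parsed at the end is shaped like the documented output rather than fetched live:

```python
import json
from urllib.parse import urlencode

API_BASE = "https://api.litellm.ai/get_max_tokens"

def build_request_url(model: str) -> str:
    """Build the GET url for the get_max_tokens endpoint."""
    return f"{API_BASE}?{urlencode({'model': model})}"

def parse_response(body: str) -> dict:
    """Decode the JSON body returned by the endpoint."""
    return json.loads(body)

# URL for claude-2, matching the curl example above
url = build_request_url("claude-2")

# Parse a response shaped like the documented output
sample_body = '{"input_cost_per_token": 1.102e-05}'
data = parse_response(sample_body)
```

In a real call you would pass `url` to an HTTP client and feed the response body to `parse_response`.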
## using the litellm python package
Usage
```python
import litellm

# model_cost maps model names to their context window and per-token costs
model_data = litellm.model_cost["gpt-4"]
```
### output
```json
{
  "input_cost_per_token": 3e-06,
  "max_tokens": 8192,
  "model": "gpt-4",
  "output_cost_per_token": 6e-05
}
```
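The per-token prices above can be used to estimate the cost of a call. A minimal sketch, assuming the entry shape shown in the output above; it uses a hard-coded copy of the gpt-4 entry so it runs without the `litellm` package installed (with the package, `litellm.model_cost["gpt-4"]` returns the same structure), and `estimate_cost` is a hypothetical helper, not a LiteLLM function:

```python
# Hard-coded copy of the gpt-4 entry shown above; with litellm
# installed you would use: model_data = litellm.model_cost["gpt-4"]
model_data = {
    "input_cost_per_token": 3e-06,
    "max_tokens": 8192,
    "model": "gpt-4",
    "output_cost_per_token": 6e-05,
}

def estimate_cost(prompt_tokens: int, completion_tokens: int, entry: dict) -> float:
    """Estimate USD cost from token counts and a model_cost-style entry."""
    return (prompt_tokens * entry["input_cost_per_token"]
            + completion_tokens * entry["output_cost_per_token"])

# 1000 * 3e-06 + 500 * 6e-05 = 0.033
cost = estimate_cost(1000, 500, model_data)
```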