update docs

Krrish Dholakia 2023-09-20 08:47:14 -07:00
parent ac5fcb673b
commit 8094301424
6 changed files with 100 additions and 24 deletions

View file

@@ -5,11 +5,30 @@ LiteLLM supports j2-light, j2-mid and j2-ultra from [AI21](https://www.ai21.com/
They're available to use without a waitlist.
### API KEYS
We provide a free $10 community-key for testing all providers on LiteLLM. You can replace this with your own key.
```python
import os
os.environ["AI21_API_KEY"] = ""
os.environ["AI21_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw" # [OPTIONAL] replace with your ai21 key
```
### Sample Usage
```python
from litellm import completion
# set env variable - [OPTIONAL] replace with your ai21 key
os.environ["AI21_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw"
messages = [{"role": "user", "content": "Write me a poem about the blue sky"}]
completion(model="j2-light", messages=messages)
```
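Note (illustrative sketch, not part of this commit): since LiteLLM normalizes provider responses to the OpenAI format, the AI21 reply from the call above can be read the same way as any other provider's:
```python
import os
from litellm import completion

# [OPTIONAL] replace with your ai21 key
os.environ["AI21_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw"

messages = [{"role": "user", "content": "Write me a poem about the blue sky"}]
response = completion(model="j2-light", messages=messages)

# responses are normalized to the OpenAI format
print(response["choices"][0]["message"]["content"])
```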
**Need a dedicated key?**
Email us @ krrish@berri.ai
[**See all supported models by the litellm api key**](../proxy_api.md#supported-models-for-litellm-key)
### AI21 Models
| Model Name | Function Call | Required OS Variables |

View file

@@ -1,11 +1,43 @@
# Anthropic
LiteLLM supports Claude-1, 1.2 and Claude-2.
-### API KEYS
+## API Keys
We provide a free $10 community-key for testing all providers on LiteLLM. You can replace this with your own key.
```python
import os
os.environ["ANTHROPIC_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw" # [OPTIONAL] replace with your anthropic key
```
## Sample Usage
```python
import os
from litellm import completion
os.environ["ANTHROPIC_API_KEY"] = ""
# set env - [OPTIONAL] replace with your anthropic key
os.environ["ANTHROPIC_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw"
messages = [{"role": "user", "content": "Hey! how's it going?"}]
response = completion(model="claude-instant-1", messages=messages)
print(response)
```
## Streaming
Just set `stream=True` when calling completion.
```python
import os
from litellm import completion
# set env - [OPTIONAL] replace with your anthropic key
os.environ["ANTHROPIC_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw"
messages = [{"role": "user", "content": "Hey! how's it going?"}]
response = completion(model="claude-instant-1", messages=messages, stream=True)
for chunk in response:
    print(chunk["choices"][0]["delta"]["content"]) # same as openai format
```

View file

@@ -2,14 +2,23 @@
LiteLLM supports all LLMs on NLP Cloud.
-## quick start
+## API Keys
We provide a free $10 community-key for testing all providers on LiteLLM. You can replace this with your own key.
```python
import os
os.environ["NLP_CLOUD_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw" # [OPTIONAL] replace with your nlp cloud key
```
## Sample Usage
```python
import os
from litellm import completion
-# set env
-os.environ["NLP_CLOUD_API_KEY"] = "your-key"
+# set env - [OPTIONAL] replace with your nlp cloud key
+os.environ["NLP_CLOUD_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw"
messages = [{"role": "user", "content": "Hey! how's it going?"}]
response = completion(model="dolphin", messages=messages)
@@ -23,8 +32,8 @@ Just set `stream=True` when calling completion.
import os
from litellm import completion
-# set env
-os.environ["NLP_CLOUD_API_KEY"] = "your-key"
+# set env - [OPTIONAL] replace with your nlp cloud key
+os.environ["NLP_CLOUD_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw"
messages = [{"role": "user", "content": "Hey! how's it going?"}]
response = completion(model="dolphin", messages=messages, stream=True)
@@ -43,8 +52,8 @@ If you're trying to call any other model (e.g. GPT-J, Llama-2, etc.) with nlp cl
import os
from litellm import completion
-# set env
-os.environ["NLP_CLOUD_API_KEY"] = "your-key"
+# set env - [OPTIONAL] replace with your nlp cloud key
+os.environ["NLP_CLOUD_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw"
messages = [{"role": "user", "content": "Hey! how's it going?"}]
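Note (illustrative sketch, not part of this commit): the hunk above is cut off before the call itself; the `nlp_cloud/` prefix and the model name below are assumptions based on LiteLLM's usual provider-prefix convention. Calling a non-default NLP Cloud model would look roughly like this:
```python
import os
from litellm import completion

# set env - [OPTIONAL] replace with your nlp cloud key
os.environ["NLP_CLOUD_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw"

messages = [{"role": "user", "content": "Hey! how's it going?"}]

# assumed convention: prefix the NLP Cloud model name with "nlp_cloud/"
# (model name below is purely illustrative)
response = completion(model="nlp_cloud/finetuned-gpt-neox-20b", messages=messages)
print(response["choices"][0]["message"]["content"])
```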

View file

@@ -1,11 +1,13 @@
# OpenAI
LiteLLM supports OpenAI Chat + Text completion and embedding calls.
-### API KEYS
+### API Keys
We provide a free $10 community-key for testing all providers on LiteLLM. You can replace this with your own key.
```python
import os
os.environ["OPENAI_API_KEY"] = ""
os.environ["OPENAI_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw" # [OPTIONAL] replace with your openai key
```
### Usage
@@ -13,7 +15,7 @@ os.environ["OPENAI_API_KEY"] = ""
import os
from litellm import completion
os.environ["OPENAI_API_KEY"] = ""
os.environ["OPENAI_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw" # [OPTIONAL] replace with your openai key
messages = [{ "content": "Hello, how are you?","role": "user"}]
@@ -22,6 +24,11 @@ messages = [{ "content": "Hello, how are you?","role": "user"}]
response = completion("gpt-3.5-turbo", messages)
```
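Note (illustrative sketch, not part of this commit): the intro above also mentions embedding calls, which this hunk doesn't show. Assuming LiteLLM's `embedding` helper mirrors `completion` and returns the OpenAI embeddings format (the model name is an assumption):
```python
import os
from litellm import embedding

os.environ["OPENAI_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw" # [OPTIONAL] replace with your openai key

# embedding() mirrors the completion() interface; the model name here is illustrative
response = embedding(model="text-embedding-ada-002", input=["Hello, how are you?"])

# response follows the OpenAI embeddings format
print(len(response["data"][0]["embedding"]))
```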
**Need a dedicated key?**
Email us @ krrish@berri.ai
[**See all supported models by the litellm api key**](../proxy_api.md#supported-models-for-litellm-key)
### OpenAI Chat Completion Models
| Model Name | Function Call | Required OS Variables |

View file

@@ -1,11 +1,12 @@
# Together AI
LiteLLM supports all models on Together AI.
-## API KEYS
+## API Keys
We provide a free $10 community-key for testing all providers on LiteLLM. You can replace this with your own key.
```python
import os
os.environ["TOGETHERAI_API_KEY"] = ""
os.environ["TOGETHERAI_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw" # [OPTIONAL] replace with your together ai key
```
## Sample Usage
@@ -13,14 +14,19 @@ os.environ["TOGETHERAI_API_KEY"] = ""
```python
from litellm import completion
-# set env variable
-os.environ["TOGETHERAI_API_KEY"] = ""
+# set env variable - [OPTIONAL] replace with your together ai key
+os.environ["TOGETHERAI_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw"
messages = [{"role": "user", "content": "Write me a poem about the blue sky"}]
completion(model="together_ai/togethercomputer/Llama-2-7B-32K-Instruct", messages=messages)
```
**Need a dedicated key?**
Email us @ krrish@berri.ai
[**See all supported models by the litellm api key**](../proxy_api.md#supported-models-for-litellm-key)
## Together AI Models
LiteLLM supports `non-streaming` and `streaming` requests to all models on https://api.together.xyz/
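Note (illustrative sketch, not part of this commit): since both non-streaming and streaming requests are supported, streaming follows the same pattern as the other providers in this commit: pass `stream=True` and iterate over the chunks. This sketch assumes the same OpenAI-style chunk format shown in the Anthropic example above.
```python
import os
from litellm import completion

# set env variable - [OPTIONAL] replace with your together ai key
os.environ["TOGETHERAI_API_KEY"] = "sk-litellm-7_NPZhMGxY2GoHC59LgbDw"

messages = [{"role": "user", "content": "Write me a poem about the blue sky"}]
response = completion(
    model="together_ai/togethercomputer/Llama-2-7B-32K-Instruct",
    messages=messages,
    stream=True,
)

# chunks are normalized to the OpenAI streaming format
for chunk in response:
    print(chunk["choices"][0]["delta"]["content"])
```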

View file

@@ -23,13 +23,16 @@ response = completion("command-nightly", messages)
**Need a dedicated key?**
Email us @ krrish@berri.ai
-## Supported Models
+## Supported Models for LiteLLM Key
These are the models that currently work with the "sk-litellm-.." keys.
-* OpenAI models
+For a complete list of models/providers that you can call with LiteLLM, [check out our provider list](./providers/)
+* OpenAI models - [OpenAI docs](./providers/openai.md)
* gpt-4
* gpt-3.5-turbo
* gpt-3.5-turbo-16k
-* Llama2 models
+* Llama2 models - [TogetherAI docs](./providers/togetherai.md)
* togethercomputer/llama-2-70b-chat
* togethercomputer/llama-2-70b
* togethercomputer/LLaMA-2-7B-32K
@@ -38,17 +41,17 @@ Email us @ krrish@berri.ai
* togethercomputer/CodeLlama-34b
* WizardLM/WizardCoder-Python-34B-V1.0
* NousResearch/Nous-Hermes-Llama2-13b
-* Falcon models
+* Falcon models - [TogetherAI docs](./providers/togetherai.md)
* togethercomputer/falcon-40b-instruct
* togethercomputer/falcon-7b-instruct
-* Jurassic/AI21 models
+* Jurassic/AI21 models - [AI21 docs](./providers/ai21.md)
* j2-ultra
* j2-mid
* j2-light
-* NLP Cloud models
+* NLP Cloud models - [NLPCloud docs](./providers/nlp_cloud.md)
* dolphin
* chatdolphin
-* Anthropic models
+* Anthropic models - [Anthropic docs](./providers/anthropic.md)
* claude-2
* claude-instant-v1