forked from phoenix/litellm-mirror
(docs) add groq ai

parent ff815e93a8, commit 602b25972d
3 changed files with 51 additions and 0 deletions
@@ -209,6 +209,7 @@ curl 'http://0.0.0.0:8000/key/generate' \
 | [ollama](https://docs.litellm.ai/docs/providers/ollama) | ✅ | ✅ | ✅ | ✅ |
 | [deepinfra](https://docs.litellm.ai/docs/providers/deepinfra) | ✅ | ✅ | ✅ | ✅ |
 | [perplexity-ai](https://docs.litellm.ai/docs/providers/perplexity) | ✅ | ✅ | ✅ | ✅ |
+| [Groq AI](https://docs.litellm.ai/docs/providers/groq) | ✅ | ✅ | ✅ | ✅ |
 | [anyscale](https://docs.litellm.ai/docs/providers/anyscale) | ✅ | ✅ | ✅ | ✅ |
 | [voyage ai](https://docs.litellm.ai/docs/providers/voyage) | | | | | ✅ |
 | [xinference [Xorbits Inference]](https://docs.litellm.ai/docs/providers/xinference) | | | | | ✅ |
docs/my-website/docs/providers/groq.md (new file, 49 lines)
@@ -0,0 +1,49 @@
# Groq
https://groq.com/

## API Key
```python
# env variable
os.environ['GROQ_API_KEY']
```

## Sample Usage
```python
from litellm import completion
import os

os.environ['GROQ_API_KEY'] = ""
response = completion(
    model="groq/llama2-70b-4096",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
```

## Sample Usage - Streaming
```python
from litellm import completion
import os

os.environ['GROQ_API_KEY'] = ""
response = completion(
    model="groq/llama2-70b-4096",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
    stream=True
)

for chunk in response:
    print(chunk)
```
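With `stream=True`, the reply arrives as OpenAI-style chunks whose `choices[0].delta` carries a text fragment; joining the fragments yields the full message. A minimal sketch of that assembly, using plain dicts as stand-in chunks so it runs without a live API call (an assumption: real litellm chunks are response objects, so attribute access may differ):

```python
# Stand-in chunks shaped like OpenAI-style streaming deltas (assumption:
# litellm's real chunks are objects; adapt the lookups if you use them).
chunks = [
    {"choices": [{"delta": {"content": "hello"}}]},
    {"choices": [{"delta": {"content": " from"}}]},
    {"choices": [{"delta": {"content": " litellm"}}]},
    {"choices": [{"delta": {}}]},  # a final chunk may carry no content
]

def assemble(chunks):
    """Concatenate the content fragments of a chunk stream."""
    return "".join(
        c["choices"][0]["delta"].get("content") or "" for c in chunks
    )

print(assemble(chunks))  # -> hello from litellm
```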

## Supported Models - ALL Groq Models Supported!
We support ALL Groq models; just set `groq/` as a prefix when sending completion requests.

| Model Name | Function Call |
|-----------------|------------------------------------------------------|
| llama2-70b-4096 | `completion(model="groq/llama2-70b-4096", messages)` |
@@ -131,6 +131,7 @@ const sidebars = {
         "providers/bedrock",
         "providers/anyscale",
         "providers/perplexity",
+        "providers/groq",
         "providers/vllm",
         "providers/xinference",
         "providers/cloudflare_workers",