
# Groq

https://groq.com/

We support ALL Groq models; just set `groq/` as a prefix when sending completion requests.

## API Key

```python
# env variable
os.environ['GROQ_API_KEY']
```
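If you'd rather not rely on an environment variable, the key can also be supplied per call; a minimal sketch, assuming `completion()` accepts an `api_key` parameter:

```python
from litellm import completion

# Pass the Groq key directly instead of reading GROQ_API_KEY from the environment
# (assumes completion() supports a per-call api_key parameter; placeholder value below)
response = completion(
    model="groq/llama2-70b-4096",
    messages=[{"role": "user", "content": "hello from litellm"}],
    api_key="your-groq-api-key",  # hypothetical placeholder
)
```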

## Sample Usage

```python
from litellm import completion
import os

os.environ['GROQ_API_KEY'] = ""

response = completion(
    model="groq/llama2-70b-4096",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
```
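The call returns an OpenAI-style response object, so `print(response)` shows the full payload. A minimal sketch of pulling out just the assistant text, assuming the standard `choices[0].message.content` layout:

```python
# Extract only the generated text from the OpenAI-style response object
# (assumes the standard choices[0].message.content layout)
print(response.choices[0].message.content)
```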

## Sample Usage - Streaming

```python
from litellm import completion
import os

os.environ['GROQ_API_KEY'] = ""

response = completion(
    model="groq/llama2-70b-4096",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
    stream=True,
)

for chunk in response:
    print(chunk)
```
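With `stream=True`, each chunk follows the OpenAI streaming format. In place of the simple `print(chunk)` loop above, a minimal sketch that reassembles the full reply, assuming the usual `chunk.choices[0].delta.content` layout (which may be `None` on the final chunk):

```python
# Reassemble the streamed reply into a single string
# (delta.content can be None on the final chunk, hence the `or ""`)
full_reply = ""
for chunk in response:
    full_reply += chunk.choices[0].delta.content or ""
print(full_reply)
```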

## Supported Models - ALL Groq Models Supported!

We support ALL Groq models; just set `groq/` as a prefix when sending completion requests.

| Model Name | Function Call |
|---|---|
| llama2-70b-4096 | `completion(model="groq/llama2-70b-4096", messages)` |
| mixtral-8x7b-32768 | `completion(model="groq/mixtral-8x7b-32768", messages)` |
| gemma-7b-it | `completion(model="groq/gemma-7b-it", messages)` |
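Any model in the table plugs into the same call; a minimal sketch switching to Mixtral by changing only the `groq/` model string:

```python
from litellm import completion
import os

os.environ['GROQ_API_KEY'] = ""

# Same call as the sample usage above; only the model string changes
response = completion(
    model="groq/mixtral-8x7b-32768",
    messages=[{"role": "user", "content": "hello from litellm"}],
)
print(response)
```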