(docs) setting extra_headers
parent 3c8b58bd80
commit aa9164e2ce

3 changed files with 2 additions and 8 deletions
@@ -227,6 +227,7 @@ model_list:
   - model_name: openai-gpt-3.5
     litellm_params:
       model: openai/gpt-3.5-turbo
+      extra_headers: {"AI-Resource Group": "ishaan-resource"}
       api_key: sk-123
       organization: org-ikDc4ex8NB
       temperature: 0.2
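With `extra_headers` set on a deployment's `litellm_params`, the proxy attaches those headers when it forwards requests routed to that model. A minimal sketch of calling the deployment through the proxy with the OpenAI SDK; the base URL, port, and virtual key are placeholders, not part of this commit:

```python
import openai

# Sketch only: assumes the proxy is running locally on the default port 4000
# and "sk-1234" is a placeholder virtual key. The "AI-Resource Group" header
# configured above is added by the proxy when it forwards the request upstream.
client = openai.OpenAI(api_key="sk-1234", base_url="http://0.0.0.0:4000")

response = client.chat.completions.create(
    model="openai-gpt-3.5",  # model_name from the config above
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```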
@@ -234,10 +235,6 @@ model_list:
     litellm_params:
       model: ollama/mistral
       api_base: your_ollama_api_base
-      headers: {
-        "HTTP-Referer": "litellm.ai",
-        "X-Title": "LiteLLM Server"
-      }
   ```

   **Step 2**: Start server with config
@@ -787,10 +787,6 @@ model_list:
     litellm_params:
       model: ollama/mistral
       api_base: your_ollama_api_base
-      headers: {
-        "HTTP-Referer": "litellm.ai",
-        "X-Title": "LiteLLM Server"
-      }
   ```

   **Step 2**: Start server with config
@@ -436,6 +436,7 @@ def completion(
         api_version (str, optional): API version (default is None).
         api_key (str, optional): API key (default is None).
         model_list (list, optional): List of api base, version, keys
+        extra_headers (dict, optional): Additional headers to include in the request.

         LITELLM Specific Params
         mock_response (str, optional): If provided, return a mock completion response for testing or debugging purposes (default is None).
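The new docstring entry documents `extra_headers` on `litellm.completion()`. A minimal usage sketch; the header value mirrors the config example above and the API key is a placeholder:

```python
import litellm

# Hedged sketch: extra_headers is forwarded as additional HTTP headers on the
# request to the provider, per the docstring added in this commit.
response = litellm.completion(
    model="openai/gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What is the weather like?"}],
    api_key="sk-123",  # placeholder
    extra_headers={"AI-Resource Group": "ishaan-resource"},
)
print(response.choices[0].message.content)
```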