forked from phoenix/litellm-mirror
docs(openai_compatible.md): doc on disabling system messages
This commit is contained in:
parent dad09fdc3d
commit 31dc3cd84f
2 changed files with 16 additions and 1 deletion
@@ -115,3 +115,18 @@ Here's how to call an OpenAI-Compatible Endpoint with the LiteLLM Proxy Server
</TabItem>
</Tabs>

### Advanced - Disable System Messages

Some vLLM models (e.g., Gemma) don't support system messages. To map system messages in those requests to `user` messages, use the `supports_system_message` flag.

```yaml
model_list:
  - model_name: my-custom-model
    litellm_params:
      model: openai/google/gemma
      api_base: http://my-custom-base
      api_key: ""
      supports_system_message: False # 👈 KEY CHANGE
```
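
To see the flag in effect, send a request that includes a system message through the proxy. This is a minimal sketch, assuming the proxy was started with the config above and is listening on its default port (4000); with `supports_system_message: False`, LiteLLM maps the `system` message to a `user` message before forwarding the request to the backend:

```shell
# Hypothetical request; assumes the proxy from the config above is
# running on the default port 4000. Adjust host/port if yours differ.
curl --location 'http://0.0.0.0:4000/chat/completions' \
    --header 'Content-Type: application/json' \
    --data '{
        "model": "my-custom-model",
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Hello!"}
        ]
    }'
```

The Gemma backend should then receive both turns as `user` messages, so models that reject the `system` role still work.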

@@ -427,7 +427,7 @@ model_list:
```shell
$ litellm --config /path/to/config.yaml
```

## Setting Embedding Models