forked from phoenix/litellm-mirror
fix docs
This commit is contained in:
parent
abb3793e50
commit
577c1e2dac
1 changed file with 1 addition and 1 deletion
@@ -25,7 +25,7 @@ In order to use your custom OpenAI Chat Completion proxy with LiteLLM, ensure you

* `custom_llm_provider` to `openai`: this ensures LiteLLM uses `openai.ChatCompletion` to call your api_base

```python
import os
from litellm import completion

## set ENV variables
```
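The changed doc line above says that setting `custom_llm_provider` to `openai` makes LiteLLM send the request through its OpenAI chat-completion path against your own `api_base`. A minimal sketch of the resulting call, assuming a hypothetical proxy at `http://0.0.0.0:8000` and a hypothetical model name (the actual network call is left commented out since it needs a running proxy):

```python
import os

# Many OpenAI-compatible proxies ignore the key, but the client expects one.
os.environ["OPENAI_API_KEY"] = "anything"

# Arguments for litellm.completion(). `custom_llm_provider="openai"` tells
# LiteLLM to use its openai.ChatCompletion route against the given api_base.
params = {
    "model": "my-custom-model",        # hypothetical model served by the proxy
    "messages": [{"role": "user", "content": "Hey, how's it going?"}],
    "api_base": "http://0.0.0.0:8000", # hypothetical proxy address
    "custom_llm_provider": "openai",
}

# With a proxy actually running, the call would be:
# from litellm import completion
# response = completion(**params)
```

This only assembles the parameters; the commented-out `completion(**params)` line shows where the documented setting takes effect.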