(docs) proxy

ishaan-jaff 2023-11-09 10:38:19 -08:00
parent a44aa8c10c
commit 2c67bda137


@@ -170,7 +170,7 @@ $ litellm --model command-nightly
 - GET `/models` - available models on server
-### Using with OpenAI compatible projects
+## Using with OpenAI compatible projects
 LiteLLM allows you to set `openai.api_base` to the proxy server and use all LiteLLM supported LLMs in any OpenAI supported project
 <Tabs>
@@ -341,7 +341,7 @@ print(result)
 </TabItem>
 </Tabs>
-## Advanced
+## Proxy Configs
 ### Set Custom Prompt Templates
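The docs changed above describe pointing `openai.api_base` at the proxy. Under the hood, any OpenAI-compatible client then just POSTs a chat-completions body to the proxy's `/chat/completions` route; here is a minimal stdlib sketch of that request (the base URL, model name, and dummy API key are assumptions — the proxy holds the real provider credentials, so any key string works):

```python
import json
import urllib.request

# Assumption: a LiteLLM proxy is listening at this address.
PROXY_BASE = "http://0.0.0.0:8000"


def build_chat_request(prompt: str, model: str = "gpt-3.5-turbo") -> urllib.request.Request:
    # Same JSON body the OpenAI SDK produces once openai.api_base
    # points at the proxy.
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{PROXY_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer anything"},  # dummy key; proxy injects real keys
        method="POST",
    )


req = build_chat_request("Hello, proxy")
```

Sending `req` with `urllib.request.urlopen` (with the proxy running) returns the standard OpenAI-shaped JSON response, which is why existing OpenAI-based projects work unchanged.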