Mirror of https://github.com/BerriAI/litellm.git, synced 2025-04-24 18:24:20 +00:00
(docs) proxy server
This commit is contained in:
parent 28a192df3a
commit 94c1d71b2c
1 changed file with 7 additions and 6 deletions
@@ -74,6 +74,13 @@ print(response)
</Tabs>

### Server Endpoints
- POST `/chat/completions` - chat completions endpoint to call 100+ LLMs
- POST `/completions` - completions endpoint
- POST `/embeddings` - embedding endpoint for Azure, OpenAI, Huggingface endpoints
- GET `/models` - available models on server
- POST `/key/generate` - generate a key to access the proxy
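The endpoints above accept the OpenAI request format. As a minimal sketch (model name and message content are placeholders, not values from this commit), the body for a `POST /chat/completions` call looks like:

```python
import json

# Sketch of an OpenAI-format request body for the proxy's
# POST /chat/completions endpoint. The model name and message
# are placeholders; the proxy routes "model" to a configured LLM.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "what llm are you"}],
}
body = json.dumps(payload)
print(body)
```

The same JSON body works for any of the 100+ LLMs the proxy fronts; only the `model` field changes.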

### Supported LLMs

All LiteLLM supported LLMs are supported on the Proxy. See all [supported LLMs](https://docs.litellm.ai/docs/providers).

<Tabs>
@@ -225,12 +232,6 @@ $ litellm --model command-nightly
</Tabs>

### Server Endpoints
- POST `/chat/completions` - chat completions endpoint to call 100+ LLMs
- POST `/completions` - completions endpoint
- POST `/embeddings` - embedding endpoint for Azure, OpenAI, Huggingface endpoints
- GET `/models` - available models on server

## Using with OpenAI compatible projects
LiteLLM allows you to set `openai.api_base` to the proxy server URL and use all LiteLLM supported LLMs in any OpenAI-compatible project.
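As a minimal sketch of that redirection (the proxy address below is an assumption; use whatever host and port your proxy runs on), the base URL can also be supplied through the environment, which OpenAI-compatible clients read in place of an in-code `openai.api_base` assignment:

```python
import os

# Point OpenAI-compatible clients at the LiteLLM proxy instead of
# api.openai.com. The address is an assumption -- substitute your
# proxy's host and port.
os.environ["OPENAI_API_BASE"] = "http://0.0.0.0:8000"
# Provider credentials live on the proxy, so a placeholder key suffices.
os.environ["OPENAI_API_KEY"] = "sk-placeholder"
```

After this, unmodified OpenAI SDK calls in the project are served by whichever model the proxy was started with.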