mirror of
https://github.com/BerriAI/litellm.git
synced 2025-04-24 18:24:20 +00:00
(docs) add LiteLLM Server - deploy liteLLM
This commit is contained in:
parent
7f90f400c3
commit
895cb5d0f9
1 changed file with 14 additions and 5 deletions
@@ -5,10 +5,6 @@ import TabItem from '@theme/TabItem';
https://github.com/BerriAI/litellm
[](https://l.linklyhq.com/l/1uHtX)
[](https://l.linklyhq.com/l/1uHsr)
[](https://docs.litellm.ai/docs/simple_proxy#deploy-on-aws-apprunner)
import QuickStart from '../src/components/QuickStart.js'
## **Call 100+ LLMs using the same Input/Output Format**
@@ -399,7 +395,20 @@ response = completion(
)
```
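The diff context above ends a `completion(...)` call. As a minimal sketch of the unified call pattern this doc describes — the same OpenAI-style `messages` payload for every provider, with only the `model` string changing — here is a small illustration. The helper name `build_request`, the model names, and the prompt are assumptions for illustration, not part of the LiteLLM API; with `litellm` installed and provider keys set, the built payload would be passed to `litellm.completion`.

```python
def build_request(model: str, user_prompt: str) -> dict:
    """Build an OpenAI-format chat request body (hypothetical helper)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_prompt}],
    }

# The same input shape is reused across providers; only `model` changes.
openai_req = build_request("gpt-3.5-turbo", "Hello, how are you?")
cohere_req = build_request("command-nightly", "Hello, how are you?")

# With litellm installed and the relevant API keys exported, each request
# would be sent via:
#   from litellm import completion
#   response = completion(**openai_req)
print(openai_req["messages"][0]["role"])  # -> user
```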
Need a dedicated key? Email us at krrish@berri.ai
## 💥 LiteLLM Server - Deploy LiteLLM
1-Click Deploy
A simple, fast, and lightweight OpenAI-compatible server to call 100+ LLM APIs in the OpenAI Input/Output format.
### Server Endpoints:
- `/chat/completions` - chat completions endpoint to call 100+ LLMs
- `/models` - available models on the server
👉 Docs: https://docs.litellm.ai/docs/simple_proxy
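The two endpoints above can be exercised with any HTTP client. A hedged sketch of building the requests follows; the base URL is an assumption (the deployed server's host and port may differ), and nothing is actually sent here — the payload shape simply mirrors the OpenAI format the server accepts.

```python
import json

BASE_URL = "http://0.0.0.0:8000"  # assumed local deployment; adjust to your server


def chat_completions_request(model: str, prompt: str) -> tuple:
    """Return (url, json_body) for a POST to /chat/completions."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return f"{BASE_URL}/chat/completions", json.dumps(body)


def models_request() -> str:
    """Return the URL for a GET to /models."""
    return f"{BASE_URL}/models"


url, body = chat_completions_request("gpt-3.5-turbo", "Hello!")
print(url)               # -> http://0.0.0.0:8000/chat/completions
print(models_request())  # -> http://0.0.0.0:8000/models
```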
[](https://l.linklyhq.com/l/1uHtX)
[](https://l.linklyhq.com/l/1uHsr)
[](https://docs.litellm.ai/docs/simple_proxy#deploy-on-aws-apprunner)
## More details