docs(hosted.md): add hosted proxy info to docs
parent 5d8af892c7
commit 3ce73fce23
6 changed files with 49 additions and 0 deletions
docs/my-website/docs/hosted.md  (new file, +43 lines)
@@ -0,0 +1,43 @@
import Image from '@theme/IdealImage';

# Hosted LiteLLM Proxy

LiteLLM maintains the proxy, so you can focus on your core products.

## [**Get Onboarded**](https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat)

This is in alpha. Schedule a call with us, and we'll give you a hosted proxy within 30 minutes.

[**🚨 Schedule Call**](https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat)

### **Status**: Alpha

Our proxy is already used in production by customers.

See our status page for [**live reliability**](https://status.litellm.ai/).

### **Benefits**:

- **No Maintenance, No Infra**: We'll maintain the proxy and spin up any additional infrastructure (e.g. a separate server for spend logs), so you can load balance and track spend across multiple LLM projects.
- **Reliable**: Our hosted proxy is tested at 1K requests per second, making it reliable under high load.
- **Secure**: LiteLLM is currently working toward SOC-2 compliance, to keep your data as secure as possible.

## **Screenshots**

### 1. Create keys

<Image img={require('../img/litellm_hosted_ui_create_key.png')} />
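
The UI above is the primary way to create keys. As a rough sketch only, assuming the hosted proxy exposes the same `POST /key/generate` endpoint as the open-source LiteLLM proxy (the base URL and master key below are placeholders you'd receive at onboarding), a virtual key could also be created programmatically:

```python
import requests

# Placeholder values: your hosted proxy URL and admin (master) key are provided at onboarding.
PROXY_BASE_URL = "https://my-org.hosted-litellm.example.com"
MASTER_KEY = "sk-my-master-key"

# Assumes the hosted proxy exposes the open-source proxy's `POST /key/generate` endpoint.
resp = requests.post(
    f"{PROXY_BASE_URL}/key/generate",
    headers={"Authorization": f"Bearer {MASTER_KEY}"},
    json={"models": ["gpt-3.5-turbo"]},  # optionally restrict the key to specific models
)
resp.raise_for_status()
print(resp.json()["key"])  # the newly generated virtual key, e.g. "sk-..."
```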

### 2. Add Models

<Image img={require('../img/litellm_hosted_ui_add_models.png')}/>
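
Once models are added, they should be visible through the proxy's OpenAI-compatible API. A minimal sketch, using a placeholder base URL and virtual key:

```python
from openai import OpenAI

# Point the standard OpenAI client at the hosted proxy (placeholder URL and key).
client = OpenAI(
    base_url="https://my-org.hosted-litellm.example.com",
    api_key="sk-my-virtual-key",
)

# The proxy is OpenAI-compatible, so models added in the UI show up here.
for model in client.models.list():
    print(model.id)
```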

### 3. Track spend

<Image img={require('../img/litellm_hosted_usage_dashboard.png')} />
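
Spend is tracked in the dashboard above. As a hedged sketch, assuming the hosted proxy also exposes the open-source proxy's `GET /key/info` endpoint and response shape (placeholder URL and keys), per-key spend could be read programmatically:

```python
import requests

PROXY_BASE_URL = "https://my-org.hosted-litellm.example.com"  # placeholder
MASTER_KEY = "sk-my-master-key"                               # placeholder

# Assumes the open-source proxy's `GET /key/info` endpoint is available on the hosted proxy.
resp = requests.get(
    f"{PROXY_BASE_URL}/key/info",
    headers={"Authorization": f"Bearer {MASTER_KEY}"},
    params={"key": "sk-my-virtual-key"},  # the virtual key to inspect
)
resp.raise_for_status()
print(resp.json()["info"]["spend"])  # cumulative spend recorded for this key (USD)
```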

### 4. Configure load balancing

<Image img={require('../img/litellm_hosted_ui_router.png')} />
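
With load balancing configured, clients keep calling a single model name and the proxy routes each request to one of the underlying deployments. A minimal sketch with a placeholder base URL and virtual key:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://my-org.hosted-litellm.example.com",  # placeholder hosted proxy URL
    api_key="sk-my-virtual-key",                           # placeholder virtual key
)

# Call the shared model name; the proxy picks one of the deployments
# configured for it in the load-balancing settings above.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from the hosted proxy!"}],
)
print(response.choices[0].message.content)
```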

#### [**🚨 Schedule Call**](https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat)

@@ -105,6 +105,12 @@ const config = {
   label: 'Enterprise',
   to: "docs/enterprise"
 },
+{
+  sidebarId: 'tutorialSidebar',
+  position: 'left',
+  label: '🚀 Hosted',
+  to: "docs/hosted"
+},
 {
   href: 'https://github.com/BerriAI/litellm',
   label: 'GitHub',

BIN  docs/my-website/img/litellm_hosted_ui_add_models.png    (new file, binary file not shown, 398 KiB)
BIN  docs/my-website/img/litellm_hosted_ui_create_key.png    (new file, binary file not shown, 496 KiB)
BIN  docs/my-website/img/litellm_hosted_ui_router.png        (new file, binary file not shown, 348 KiB)
BIN  docs/my-website/img/litellm_hosted_usage_dashboard.png  (new file, binary file not shown, 460 KiB)