docs cleanup

Ishaan Jaff 2024-08-17 15:59:23 -07:00
parent eff874bf05
commit d9c91838ce
2 changed files with 6 additions and 5 deletions


@@ -17,11 +17,11 @@ You can use litellm through either:
 1. [LiteLLM Proxy Server](#openai-proxy) - Server (LLM Gateway) to call 100+ LLMs, load balance, cost tracking across projects
 2. [LiteLLM python SDK](#basic-usage) - Python Client to call 100+ LLMs, load balance, cost tracking
-### When to use LiteLLM Proxy Server
+### **When to use LiteLLM Proxy Server (LLM Gateway)**
 :::tip
-Use LiteLLM Proxy Server if you want a **central service to access multiple LLMs**
+Use LiteLLM Proxy Server if you want a **central service (LLM Gateway) to access multiple LLMs**
 Typically used by Gen AI Enablement / ML PLatform Teams
@@ -31,7 +31,7 @@ Typically used by Gen AI Enablement / ML PLatform Teams
 - Track LLM Usage and setup guardrails
 - Customize Logging, Guardrails, Caching per project
-### When to use LiteLLM Python SDK
+### **When to use LiteLLM Python SDK**
 :::tip
@@ -44,6 +44,7 @@ Typically used by developers building llm projects
 - LiteLLM SDK gives you a unified interface to access multiple LLMs (100+ LLMs)
 - Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - [Router](https://docs.litellm.ai/docs/routing)
+## **LiteLLM Python SDK**
 ### Basic usage
@@ -383,7 +384,7 @@ response = completion(
 )
 ```
-## OpenAI Proxy
+## **LiteLLM Proxy Server (LLM Gateway)**
 Track spend across multiple projects/people
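The "unified interface" the diff above refers to can be sketched with a minimal SDK call. This is an illustrative sketch, not taken from the changed docs: the model name, prompt, and helper names below are assumptions, and actually running it requires a provider API key (e.g. `OPENAI_API_KEY`) in the environment.

```python
def build_messages(prompt: str) -> list[dict]:
    """OpenAI-style chat messages -- the one request shape LiteLLM
    accepts regardless of which provider ends up serving the call."""
    return [{"role": "user", "content": prompt}]


def ask(prompt: str, model: str = "gpt-4o-mini") -> str:
    """Call a model through LiteLLM's single completion() interface.

    Swapping providers is just a model-string change, e.g.
    model="anthropic/claude-3-haiku-20240307" -- no other code changes.
    (model names here are illustrative)
    """
    from litellm import completion  # deferred import; pip install litellm

    response = completion(model=model, messages=build_messages(prompt))
    return response.choices[0].message.content


# Usage (needs a provider key in the environment, e.g. OPENAI_API_KEY):
#   print(ask("Hello from LiteLLM"))
```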


@@ -23,7 +23,7 @@ const sidebars = {
       label: "💥 LiteLLM Proxy Server",
       link: {
         type: "generated-index",
-        title: "💥 LiteLLM Proxy Server",
+        title: "💥 LiteLLM Proxy Server (LLM Gateway)",
         description: `OpenAI Proxy Server (LLM Gateway) to call 100+ LLMs in a unified interface & track spend, set budgets per virtual key/user`,
         slug: "/simple_proxy",
       },