docs use (LLM Gateway) in some places

Ishaan Jaff 2024-08-08 17:00:52 -07:00 committed by Krrish Dholakia
parent 74af2041cc
commit b2b505611c
3 changed files with 3 additions and 3 deletions

@@ -14,7 +14,7 @@ https://github.com/BerriAI/litellm
 ## How to use LiteLLM
 You can use litellm through either:
-1. [LiteLLM Proxy Server](#openai-proxy) - Server to call 100+ LLMs, load balance, cost tracking across projects
+1. [LiteLLM Proxy Server](#openai-proxy) - Server (LLM Gateway) to call 100+ LLMs, load balance, cost tracking across projects
 2. [LiteLLM python SDK](#basic-usage) - Python Client to call 100+ LLMs, load balance, cost tracking
 ### When to use LiteLLM Proxy Server

@@ -5,7 +5,7 @@ import TabItem from '@theme/TabItem';
 # Quick Start
 Quick start CLI, Config, Docker
-LiteLLM Server manages:
+LiteLLM Server (LLM Gateway) manages:
 * **Unified Interface**: Calling 100+ LLMs [Huggingface/Bedrock/TogetherAI/etc.](#other-supported-models) in the OpenAI `ChatCompletions` & `Completions` format
 * **Cost tracking**: Authentication, Spend Tracking & Budgets [Virtual Keys](https://docs.litellm.ai/docs/proxy/virtual_keys)
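For context on the server being renamed here: the LiteLLM Server (LLM Gateway) is typically driven by a `config.yaml`. A minimal sketch, assuming an OpenAI-backed model; the model name and env-var reference are illustrative placeholders, not part of this commit:

```yaml
# Sketch of a LiteLLM proxy config -- model_name and api_key
# reference are placeholder values for illustration.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
```

A config like this would be passed to the server via `litellm --config config.yaml`, after which clients call the gateway with the OpenAI-format `/chat/completions` endpoint.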

@@ -24,7 +24,7 @@ const sidebars = {
 link: {
 type: "generated-index",
 title: "💥 LiteLLM Proxy Server",
-description: `OpenAI Proxy Server to call 100+ LLMs in a unified interface & track spend, set budgets per virtual key/user`,
+description: `OpenAI Proxy Server (LLM Gateway) to call 100+ LLMs in a unified interface & track spend, set budgets per virtual key/user`,
 slug: "/simple_proxy",
 },
 items: [