Mirror of https://github.com/BerriAI/litellm.git (synced 2025-04-24 18:24:20 +00:00)
docs(simple_proxy.md): doc cleanup
commit 9f180831fe
parent 6fc0c74878
1 changed file with 3 additions and 3 deletions
simple_proxy.md

@@ -8,9 +8,9 @@ LiteLLM Server, is a simple, fast, and lightweight **OpenAI-compatible server**
 
 LiteLLM Server supports:
 
-* LLM API Calls in the OpenAI ChatCompletions format
-* Set custom prompt templates
-* Caching + Logging capabilities (Redis and Langfuse, respectively)
+* Call [Huggingface/Bedrock/TogetherAI/etc.](#other-supported-models) in the OpenAI ChatCompletions format
+* Set custom prompt templates + model-specific configs (temperature, max_tokens, etc.)
+* Caching (In-memory + Redis)
 
 [**See Code**](https://github.com/BerriAI/litellm/tree/main/litellm_server)
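For context on the revised feature list: the first bullet refers to the proxy exposing an OpenAI-style `/chat/completions` endpoint. Below is a minimal sketch of such a call; the local address, port, model name, and parameter values are illustrative assumptions and are not taken from this diff.

```python
# Minimal sketch: sending an OpenAI ChatCompletions-format request to a
# locally running LiteLLM Server. The address, port, model name, and
# temperature below are assumptions, not values from this commit.
import requests

response = requests.post(
    "http://0.0.0.0:8000/chat/completions",  # assumed local deployment address
    json={
        "model": "gpt-3.5-turbo",  # any model the server is configured to route
        "messages": [
            {"role": "user", "content": "What LLM are you?"}
        ],
        "temperature": 0.7,  # model-specific configs like this can also be set server-side
    },
    timeout=30,
)

print(response.json())
```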