---
title: v1.55.8-stable
slug: v1.55.8-stable
date: 2024-12-22T10:00:00
authors:
tags:
hide_table_of_contents: false
---

import Image from '@theme/IdealImage';

# v1.55.8-stable

A new LiteLLM Stable release just went out. Here are 5 updates since v1.52.2-stable.

`langfuse`, `fallbacks`, `new models`, `azure_storage`

## Langfuse Prompt Management

<Image img={require('../../img/langfuse_prmpt_mgmt.png')} />

This makes it easy to run experiments, or swap a specific model (e.g. `gpt-4o` to `gpt-4o-mini`) on Langfuse, instead of making changes in your applications. Start here
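
A minimal sketch of what calling a Langfuse-managed prompt through LiteLLM can look like. The `langfuse/` model prefix and the `prompt_id` / `prompt_variables` parameters follow the pattern described in the linked docs; the prompt name, variable keys, and credentials below are placeholders.

```python
import os
import litellm

# Placeholder credentials; set these to your Langfuse project / provider keys.
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["OPENAI_API_KEY"] = "sk-..."

# "langfuse/<model>" routes the call through the Langfuse prompt management
# integration; "my-chat-prompt" is a hypothetical prompt name in Langfuse.
response = litellm.completion(
    model="langfuse/gpt-4o",
    prompt_id="my-chat-prompt",
    prompt_variables={"user_message": "What is LiteLLM?"},
    messages=[{"role": "user", "content": "ignored when the Langfuse prompt is used"}],
)
print(response.choices[0].message.content)
```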

## Control fallback prompts client-side

Claude prompts are different from OpenAI prompts.

Pass in prompts specific to each model when doing fallbacks. Start here
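
As a rough sketch (assuming a LiteLLM proxy running locally with `claude-3-5-sonnet` and `gpt-4o` configured as model aliases), a client can send its own fallback prompt alongside the request; the `fallbacks` entry carries the fallback model together with the messages written for it.

```python
from openai import OpenAI

# Placeholder proxy URL and key; point these at your LiteLLM proxy.
client = OpenAI(base_url="http://0.0.0.0:4000", api_key="sk-1234")

response = client.chat.completions.create(
    model="claude-3-5-sonnet",  # primary model alias (placeholder)
    messages=[{"role": "user", "content": "Answer inside <answer> XML tags: what is LiteLLM?"}],
    extra_body={
        # If the primary call fails, retry on gpt-4o with a prompt written for it.
        "fallbacks": [
            {
                "model": "gpt-4o",
                "messages": [{"role": "user", "content": "Answer as JSON: what is LiteLLM?"}],
            }
        ]
    },
)
print(response.choices[0].message.content)
```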

## New Providers / Models

- NVIDIA Triton `/infer` endpoint. Start here
- Infinity Rerank Models. Start here
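
Both can be called through the LiteLLM SDK. The sketch below assumes a Triton server and an Infinity server running locally; the hosts, model names, and the exact `api_base` paths are placeholders, so check the linked guides for the real values.

```python
import litellm

# NVIDIA Triton: embeddings against a Triton server's /infer endpoint.
embedding_response = litellm.embedding(
    model="triton/my-embedding-model",                                    # placeholder model name
    api_base="http://localhost:8000/v2/models/my-embedding-model/infer",  # placeholder /infer route
    input=["hello from litellm"],
)

# Infinity: rerank documents against a query via a local Infinity server.
rerank_response = litellm.rerank(
    model="infinity/mixedbread-ai/mxbai-rerank-large-v1",  # placeholder model name
    query="What is the capital of France?",
    documents=["Paris is the capital of France.", "Berlin is the capital of Germany."],
    api_base="http://localhost:7997",
)
print(rerank_response)
```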

## ✨ Azure Data Lake Storage Support

Send LLM usage (spend, tokens) data to Azure Data Lake. This makes it easy to consume usage data on other services (e.g. Databricks). Start here
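
As a rough sketch of enabling this, assuming the logger is registered as a callback named `azure_storage` (matching the tag above) and reads its Azure credentials from environment variables; the exact variable names are in the linked guide, and the values here are placeholders.

```python
import os
import litellm

# Assumed environment variables for the Azure Data Lake storage logger (placeholders).
os.environ["AZURE_STORAGE_ACCOUNT_NAME"] = "my-storage-account"
os.environ["AZURE_STORAGE_FILE_SYSTEM"] = "litellm-usage-logs"
os.environ["AZURE_STORAGE_TENANT_ID"] = "..."
os.environ["AZURE_STORAGE_CLIENT_ID"] = "..."
os.environ["AZURE_STORAGE_CLIENT_SECRET"] = "..."

# Enable the logger by name, like other LiteLLM logging callbacks
# (on the proxy this would go under `litellm_settings.callbacks` in config.yaml).
litellm.callbacks = ["azure_storage"]

response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "hi"}],
)
```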

## Docker Run LiteLLM

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.55.8-stable
```

## Get Daily Updates

LiteLLM ships new releases every day. Follow us on LinkedIn to get daily updates.