---
title: v1.55.8-stable
slug: v1.55.8-stable
date: 2024-12-22T10:00:00
authors:
  - name: Krrish Dholakia
    title: CEO, LiteLLM
    url: https://www.linkedin.com/in/krish-d/
    image_url: 1737327772
  - name: Ishaan Jaffer
    title: CTO, LiteLLM
    url: https://www.linkedin.com/in/reffajnaahsi/
    image_url: 1675971026
tags: [langfuse, fallbacks, new models, azure_storage]
hide_table_of_contents: false
---

import Image from '@theme/IdealImage';

# v1.55.8-stable

A new LiteLLM Stable release just went out. Here are 5 updates since v1.52.2-stable.

**langfuse, fallbacks, new models, azure_storage**

<Image img={require('../../img/langfuse_prmpt_mgmt.png')} />

## Langfuse Prompt Management

This makes it easy to run experiments or swap the specific model (e.g. `gpt-4o` to `gpt-4o-mini`) on Langfuse, instead of making changes in your application. Start here.
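
For reference, here is a rough sketch of what calling a Langfuse-managed prompt through the LiteLLM SDK could look like. The `langfuse/` model prefix and the `prompt_id` / `prompt_variables` parameters are assumptions about the integration's interface, and the keys and prompt name are placeholders; check the prompt management docs for the exact usage.

```python
# Hedged sketch: resolve a prompt managed in Langfuse at request time.
# The `langfuse/` prefix, `prompt_id`, and `prompt_variables` names are
# assumptions -- verify against the LiteLLM prompt management docs.
import os
import litellm

os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-..."   # your Langfuse public key
os.environ["LANGFUSE_SECRET_KEY"] = "sk-..."   # your Langfuse secret key

response = litellm.completion(
    model="langfuse/gpt-4o",                    # model/prompt config lives in Langfuse
    prompt_id="my-chat-prompt",                 # prompt registered in Langfuse (placeholder)
    prompt_variables={"topic": "LiteLLM"},      # variables substituted into the prompt
    messages=[{"role": "user", "content": "placeholder message"}],
)
print(response.choices[0].message.content)
```

Because the prompt and model are resolved on Langfuse, switching `gpt-4o` to `gpt-4o-mini` becomes a change in the Langfuse UI rather than a code deploy.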

## Control fallback prompts client-side

Claude prompts are different from OpenAI's.

Pass in prompts that are specific to each model when doing fallbacks. Start here.
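
A minimal sketch of how this might look when calling the LiteLLM proxy with the OpenAI SDK. It assumes each entry in the `fallbacks` request field can carry its own `model` and `messages`; the model names, proxy URL, and key are placeholders.

```python
# Hedged sketch: per-model fallback prompts on a single request.
# Assumes fallback entries may carry their own `model` and `messages`;
# model names and the api_key are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "OpenAI-style prompt"}],
    extra_body={
        "fallbacks": [
            {
                "model": "claude-3-5-sonnet",  # fallback model (placeholder name)
                "messages": [
                    {"role": "user", "content": "Claude-style prompt"}
                ],
            }
        ]
    },
)
print(response.choices[0].message.content)
```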

## New Providers / Models

## Azure Data Lake Storage Support

Send LLM usage (spend, tokens) data to Azure Data Lake Storage. This makes it easy to consume usage data from other services (e.g. Databricks). Start here.
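
As a rough illustration, enabling the logger from Python might look like the sketch below. The `azure_storage` callback name and the environment variable names are assumptions, and the integration is primarily aimed at the proxy, so treat this as a sketch rather than the canonical setup.

```python
# Hedged sketch: turn on the Azure Data Lake usage logger.
# The callback name ("azure_storage") and these env var names are assumptions;
# the proxy config (litellm_settings.callbacks) is the documented path.
import os
import litellm

os.environ["AZURE_STORAGE_ACCOUNT_NAME"] = "my-storage-account"  # assumed env var
os.environ["AZURE_STORAGE_FILE_SYSTEM"] = "litellm-usage-logs"   # assumed env var

litellm.callbacks = ["azure_storage"]  # ship spend/token logs to Azure Data Lake

litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "hello"}],
)
```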

## Docker Run LiteLLM

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.55.8-stable
```

## Get Daily Updates

LiteLLM ships new releases every day. Follow us on LinkedIn to get daily updates.