
---
title: v1.57.7
slug: v1.57.7
date: 2025-01-10T10:00:00
authors:
  - name: Krrish Dholakia
    title: CEO, LiteLLM
    url: https://www.linkedin.com/in/krish-d/
    image_url: 1737327772
  - name: Ishaan Jaffer
    title: CTO, LiteLLM
    url: https://www.linkedin.com/in/reffajnaahsi/
    image_url: 1675971026
tags: [langfuse, management endpoints, ui, prometheus, secret management]
hide_table_of_contents: false
---


## Langfuse Prompt Management

Langfuse Prompt Management is being labelled as BETA. This allows us to iterate quickly on the feedback we're receiving, and makes the status clearer to users. We expect this feature to be stable by next month (February 2025).

Changes:

  • Include the client message in the LLM API Request. (Previously only the prompt template was sent, and the client message was ignored).
  • Log the prompt template in the logged request (e.g. to s3/langfuse).
  • Log the 'prompt_id' and 'prompt_variables' in the logged request (e.g. to s3/langfuse).
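The changes above can be sketched with the SDK's `prompt_id` / `prompt_variables` parameters. The model name, prompt ID, and variables below are placeholders, and the `completion` call is shown commented out since it requires live Langfuse/LLM credentials:

```python
# Sketch of a request using a Langfuse-managed prompt template.
# Placeholder IDs; see the Langfuse prompt management docs for exact usage.
request_kwargs = {
    "model": "langfuse/gpt-3.5-turbo",
    "prompt_id": "my-prompt-template",          # logged in the request payload
    "prompt_variables": {"customer": "Acme"},   # logged in the request payload
    # The client message is now included in the LLM API request,
    # not just the resolved prompt template.
    "messages": [{"role": "user", "content": "How do I reset my password?"}],
}

# from litellm import completion
# response = completion(**request_kwargs)
print(sorted(request_kwargs))
```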

Start Here

## Team/Organization Management + UI Improvements

Managing teams and organizations on the UI is now easier.

Changes:

  • Support for editing a user's role within a team on the UI.
  • Support for updating a team member's role to admin via API - /team/member_update
  • Show team admins all keys for their team.
  • Add organizations with budgets.
  • Assign teams to orgs on the UI.
  • Auto-assign SSO users to teams.

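The role-update call can be sketched as a POST to the proxy's `/team/member_update` endpoint. The base URL, virtual key, and IDs below are placeholders, and the request itself is commented out since it needs a running proxy:

```python
# Sketch: promote a team member to admin via /team/member_update.
# team_id / user_id values are placeholders.
import json

payload = {
    "team_id": "team-1234",
    "user_id": "user-5678",
    "role": "admin",
}

# import requests
# requests.post(
#     "http://localhost:4000/team/member_update",
#     headers={"Authorization": "Bearer sk-1234"},  # proxy admin key
#     json=payload,
# )
print(json.dumps(payload))
```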
Start Here

## Hashicorp Vault Support

We now support writing LiteLLM Virtual API keys to Hashicorp Vault.
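A proxy config for this might look like the sketch below. The exact setting names are assumptions based on LiteLLM's secret-manager config pattern; check the linked docs for the authoritative keys and required environment variables:

```yaml
# Sketch of enabling Hashicorp Vault as the key management system.
# Key names are assumptions; verify against the secret management docs.
general_settings:
  key_management_system: "hashicorp_vault"

# Vault address and auth token are typically supplied via environment
# variables (e.g. the Vault server URL and an access token).
```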

Start Here

## Custom Prometheus Metrics

Define custom Prometheus metrics, and track usage, latency, and request counts against them.

This allows for more fine-grained tracking, for example on a prompt template passed in request metadata.
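A config for this might look like the sketch below. The `custom_prometheus_metadata_labels` key name is an assumption; consult the linked docs for the exact setting:

```yaml
# Sketch: emit request metadata as extra Prometheus labels.
# The setting name below is an assumption; verify against the docs.
litellm_settings:
  callbacks: ["prometheus"]
  custom_prometheus_metadata_labels: ["metadata.prompt_template"]
```

Requests that pass `metadata: {"prompt_template": "..."}` would then be trackable per template in Prometheus.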

Start Here