Mirror of https://github.com/BerriAI/litellm.git (synced 2025-04-24 18:24:20 +00:00)
docs: cleanup

commit 1ff7625984
parent aa55103486

3 changed files with 19 additions and 2 deletions
BIN  docs/my-website/img/release_notes/unified_responses_api_rn.png  (new file, 244 KiB; binary file not shown)
BIN  docs/my-website/img/scim_integration.png  (new file, 31 KiB; binary file not shown)
@@ -1,5 +1,5 @@
 ---
-title: v1.67.0-stable - Unified Responses API
+title: v1.67.0-stable - SCIM Integration
 slug: v1.67.0-stable
 date: 2025-04-19T10:00:00
 authors:
@@ -21,12 +21,19 @@ import TabItem from '@theme/TabItem';

 ## Key Highlights

-- **Team and Tag based usage tracking**: You can now see usage and spend by team and tag at 1M+ spend logs.
+- **SCIM Integration**: Enables identity providers (Okta, Azure AD, OneLogin, etc.) to automate user and team (group) provisioning, updates, and deprovisioning.
+- **Team and Tag based usage tracking**: You can now see usage and spend by team and tag at 1M+ spend logs.
+- **Unified Responses API**: Support for calling Anthropic, Gemini, Groq, etc. via OpenAI's new Responses API.

 Let's dive in.

+## SCIM Integration
+
+<Image img={require('../../img/scim_integration.png')}/>
+
+This release adds SCIM support to LiteLLM, so your SSO provider (Okta, Azure AD, etc.) can automatically create and delete users, teams, and memberships on LiteLLM. When you remove a team in your SSO provider, the corresponding team is deleted on LiteLLM as well.
+
+[Read more](../../docs/tutorials/scim_litellm)
+
 ## Team and Tag based usage tracking

 <Image img={require('../../img/release_notes/new_team_usage_highlight.jpg')}/>
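To make the SCIM flow added in the hunk above more concrete: during provisioning, the identity provider calls SCIM endpoints on LiteLLM to create and remove users and groups. The sketch below is a rough illustration only; it assumes the proxy exposes standard SCIM v2 routes under a `/scim/v2` prefix, and the base URL, admin key, and user payload are placeholders rather than values from this commit (see the linked tutorial for the actual setup).

```python
# Hedged sketch of a SCIM provisioning call an identity provider might make to LiteLLM.
# Assumptions: the proxy exposes SCIM v2 under /scim/v2 and accepts a LiteLLM admin key
# as a bearer token; the URL, key, and user attributes below are placeholders.
import requests

SCIM_BASE = "http://localhost:4000/scim/v2"        # assumed proxy URL + SCIM prefix
HEADERS = {
    "Authorization": "Bearer sk-1234",             # placeholder admin key
    "Content-Type": "application/scim+json",       # standard SCIM media type (RFC 7644)
}

# Create a user, the way Okta / Azure AD would when someone is assigned the app
new_user = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
    "userName": "jane@example.com",
    "active": True,
}
resp = requests.post(f"{SCIM_BASE}/Users", json=new_user, headers=HEADERS, timeout=10)
resp.raise_for_status()
print(resp.json().get("id"))                       # SCIM id of the provisioned user
```

Deprovisioning works the same way in reverse: the provider issues a DELETE (or a PATCH setting `active` to false) against the same resource, which is what removes the corresponding user or team on LiteLLM.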
@@ -41,6 +48,16 @@ This release improves team and tag based usage tracking at 1m+ spend logs, makin

 [Read more](#management-endpoints--ui)

+## Unified Responses API
+
+This release allows you to call Azure OpenAI, Anthropic, AWS Bedrock, and Google Vertex AI models via the POST /v1/responses endpoint on LiteLLM. This means you can now use popular tools like [OpenAI Codex](https://docs.litellm.ai/docs/tutorials/openai_codex) with your own models.
+
+<Image img={require('../../img/release_notes/unified_responses_api_rn.png')}/>
+
+[Read more](https://docs.litellm.ai/docs/response_api)
+
 ## New Models / Updated Models

 - **OpenAI**