litellm-mirror/litellm/types
Krish Dholakia f08a4e3c06
Support 'file' message type for VLLM video URLs + Anthropic redacted message thinking support (#10129)
* feat(hosted_vllm/chat/transformation.py): support calling a vLLM video URL with the OpenAI 'file' message type

allows switching easily between Gemini and vLLM
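A minimal sketch of the message shape the commit above describes: a video URL carried in an OpenAI-style 'file' content part, so the same message works against either backend. The model names and URL below are illustrative placeholders, not values from the commit.

```python
# Sketch: an OpenAI-style 'file' content part carrying a video URL,
# as the commit describes for hosted_vllm (and Gemini) models.
# The URL and model names are placeholders.
video_message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "Summarize this video."},
        {
            "type": "file",
            "file": {"file_id": "https://example.com/clip.mp4"},
        },
    ],
}

# The point of the change: the same message is intended to be accepted by both, e.g.
# litellm.completion(model="hosted_vllm/qwen2-vl", messages=[video_message])
# litellm.completion(model="gemini/gemini-2.0-flash", messages=[video_message])
```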

* [WIP] redacted thinking tests (#9044)

* WIP: redacted thinking tests

* test: add test for redacted thinking in assistant message

---------

Co-authored-by: Krish Dholakia <krrishdholakia@gmail.com>

* fix(anthropic/chat/transformation.py): support redacted thinking block on anthropic completion

Fixes https://github.com/BerriAI/litellm/issues/9058

* fix(anthropic/chat/handler.py): transform anthropic redacted messages on streaming

Fixes https://github.com/BerriAI/litellm/issues/9058

* fix(bedrock/): support redacted text on streaming + non-streaming

Fixes https://github.com/BerriAI/litellm/issues/9058
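The three fixes above all concern Anthropic-style `redacted_thinking` content blocks. A minimal sketch of the block shape and of how a caller might keep the opaque block intact while extracting visible text; the `data` payload is an encrypted opaque string from the API, shown here as a placeholder:

```python
# Sketch: an assistant turn containing an Anthropic 'redacted_thinking'
# content block alongside normal text. The 'data' value is a placeholder
# for the opaque encrypted payload the API returns.
assistant_content = [
    {"type": "redacted_thinking", "data": "EncryptedOpaquePayload=="},
    {"type": "text", "text": "Here is the answer."},
]

# Redacted blocks should be replayed unmodified in multi-turn conversations;
# only the text blocks are user-visible.
visible_text = "".join(
    block["text"] for block in assistant_content if block["type"] == "text"
)
```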

* feat(litellm_proxy/chat/transformation.py): support 'reasoning_effort' param for proxy

allows using reasoning_effort with thinking models on the proxy
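A sketch of what the commit above enables: an OpenAI-compatible request body that passes `reasoning_effort` through the proxy to a thinking-capable backend model. The model name is a placeholder, and the mapping comment reflects the commit's stated intent rather than a documented guarantee.

```python
# Sketch: an OpenAI-compatible request body passing 'reasoning_effort'
# through the LiteLLM proxy. Model name is a placeholder.
import json

request_body = {
    "model": "litellm_proxy/claude-3-7-sonnet",
    "messages": [
        {"role": "user", "content": "Prove that sqrt(2) is irrational."}
    ],
    # per the commit, forwarded/translated for thinking models behind the proxy
    "reasoning_effort": "low",
}

payload = json.dumps(request_body)
```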

* test: update tests

* fix(utils.py): fix linting error

* fix: fix linting errors

* fix: fix linting errors

* fix: fix linting error

* fix: fix linting errors

* fix(anthropic/chat/transformation.py): fix returning citations in chat completion

---------

Co-authored-by: Johann Miller <22018973+johannkm@users.noreply.github.com>
2025-04-19 11:16:37 -07:00
integrations [UI] Allow setting prompt cache_control_injection_points (#10000) 2025-04-14 21:17:42 -07:00
llms Support 'file' message type for VLLM video URLs + Anthropic redacted message thinking support (#10129) 2025-04-19 11:16:37 -07:00
mcp_server fix listing mcp tools 2025-03-29 18:40:58 -07:00
passthrough_endpoints Squashed commit of the following: (#9709) 2025-04-02 21:24:54 -07:00
proxy Litellm dev 04 18 2025 p2 (#10157) 2025-04-19 07:32:23 -07:00
responses [Feat] Support for all litellm providers on Responses API (works with Codex) - Anthropic, Bedrock API, VertexAI, Ollama (#10132) 2025-04-18 19:53:59 -07:00
adapter.py feat(anthropic_adapter.py): support for translating anthropic params to openai format 2024-07-10 00:32:28 -07:00
caching.py (bug fix) - Fix Cache Health Check for Redis when redis_version is float (#8979) 2025-03-04 21:26:18 -08:00
completion.py feat(endpoints.py): support adding credentials by model id 2025-03-14 12:32:32 -07:00
embedding.py Removed config dict type definition 2024-05-17 10:39:00 +08:00
files.py LiteLLM Contributor PRs (02/18/2025). (#8643) 2025-02-19 21:52:46 -08:00
fine_tuning.py (feat) POST /fine_tuning/jobs support passing vertex specific hyper params (#7490) 2025-01-01 07:44:48 -08:00
guardrails.py test: update test 2025-02-10 22:13:58 -08:00
rerank.py build(pyproject.toml): add new dev dependencies - for type checking (#9631) 2025-03-29 11:02:13 -07:00
router.py [Feat] Add litellm.supports_reasoning() util to track if an llm supports reasoning (#9923) 2025-04-11 17:56:04 -07:00
scheduler.py Squashed commit of the following: (#9709) 2025-04-02 21:24:54 -07:00
services.py Add aggregate spend by tag (#10071) 2025-04-16 12:26:21 -07:00
tag_management.py Litellm dev 04 18 2025 p2 (#10157) 2025-04-19 07:32:23 -07:00
utils.py Support 'file' message type for VLLM video URLs + Anthropic redacted message thinking support (#10129) 2025-04-19 11:16:37 -07:00