Directory listing: litellm-mirror/litellm/types (last commit 2025-04-21 19:38:55 -07:00)
Name                   Last commit                 Last commit message
integrations           2025-04-14 21:17:42 -07:00  [UI] Allow setting prompt cache_control_injection_points (#10000)
llms                   2025-04-19 18:03:05 -07:00  fix(transformation.py): pass back in gemini thinking content to api (#10173)
mcp_server             2025-03-29 18:40:58 -07:00  fix listing mcp tools
passthrough_endpoints  2025-04-02 21:24:54 -07:00  Squashed commit of the following: (#9709)
proxy                  2025-04-19 07:32:23 -07:00  Litellm dev 04 18 2025 p2 (#10157)
responses              2025-04-18 19:53:59 -07:00  [Feat] Support for all litellm providers on Responses API (works with Codex) - Anthropic, Bedrock API, VertexAI, Ollama (#10132)
adapter.py             2024-07-10 00:32:28 -07:00  feat(anthropic_adapter.py): support for translating anthropic params to openai format
caching.py             2025-03-04 21:26:18 -08:00  (bug fix) - Fix Cache Health Check for Redis when redis_version is float (#8979)
completion.py          2025-03-14 12:32:32 -07:00  feat(endpoints.py): support adding credentials by model id
embedding.py           2024-05-17 10:39:00 +08:00  Removed config dict type definition
files.py               2025-02-19 21:52:46 -08:00  LiteLLM Contributor PRs (02/18/2025). (#8643)
fine_tuning.py         2025-01-01 07:44:48 -08:00  (feat) POST /fine_tuning/jobs support passing vertex specific hyper params (#7490)
guardrails.py          2025-02-10 22:13:58 -08:00  test: update test
rerank.py              2025-03-29 11:02:13 -07:00  build(pyproject.toml): add new dev dependencies - for type checking (#9631)
router.py              2025-04-21 16:39:42 -07:00  ResponsesApiDeploymentCheck
scheduler.py           2025-04-02 21:24:54 -07:00  Squashed commit of the following: (#9709)
services.py            2025-04-16 12:26:21 -07:00  Add aggregate spend by tag (#10071)
tag_management.py      2025-04-19 07:32:23 -07:00  Litellm dev 04 18 2025 p2 (#10157)
utils.py               2025-04-21 19:38:55 -07:00  simplify ResponsesApiDeploymentCheck