| Name | Last commit message | Last commit date |
| --- | --- | --- |
| anthropic_interface | docs anthropic messages interface | 2025-03-31 16:53:39 -07:00 |
| assistants | fixes for assistans API tests | 2025-04-19 07:59:53 -07:00 |
| batch_completion | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00 |
| batches | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| caching | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| experimental_mcp_client | fix types on tools.py | 2025-03-29 16:48:15 -07:00 |
| files | Support CRUD endpoints for Managed Files (#9924) | 2025-04-11 21:48:27 -07:00 |
| fine_tuning | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| integrations | [Bug Fix] Azure Blob Storage fixes (#10059) | 2025-04-16 09:47:10 -07:00 |
| litellm_core_utils | test(utils.py): handle scenario where text tokens + reasoning tokens … (#10165) | 2025-04-19 12:32:38 -07:00 |
| llms | test(utils.py): handle scenario where text tokens + reasoning tokens … (#10165) | 2025-04-19 12:32:38 -07:00 |
| proxy | fix(triton/completion/transformation.py): remove bad_words / stop wor… (#10163) | 2025-04-19 11:23:37 -07:00 |
| realtime_api | Realtime API: Support 'base_model' cost tracking + show response in spend logs (if enabled) (#9897) | 2025-04-10 21:24:45 -07:00 |
| rerank_api | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| responses | [Feat] Support for all litellm providers on Responses API (works with Codex) - Anthropic, Bedrock API, VertexAI, Ollama (#10132) | 2025-04-18 19:53:59 -07:00 |
| router_strategy | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| router_utils | LiteLLM Minor Fixes & Improvements (04/02/2025) (#9725) | 2025-04-03 11:48:52 -07:00 |
| secret_managers | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| types | Support 'file' message type for VLLM video url's + Anthropic redacted message thinking support (#10129) | 2025-04-19 11:16:37 -07:00 |
| __init__.py | [Feat] Unified Responses API - Add Azure Responses API support (#10116) | 2025-04-17 16:47:59 -07:00 |
| _logging.py | fix(logging): add json formatting for uncaught exceptions (#9615) (#9619) | 2025-03-28 15:16:15 -07:00 |
| _redis.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| _service_logger.py | clean up service metrics | 2025-04-02 17:50:41 -07:00 |
| _version.py | Virtual key based policies in Aim Guardrails (#9499) | 2025-04-01 21:57:23 -07:00 |
| budget_manager.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| constants.py | Gemini-2.5-flash - support reasoning cost calc + return reasoning content (#10141) | 2025-04-19 09:20:52 -07:00 |
| cost.json | | |
| cost_calculator.py | Realtime API: Support 'base_model' cost tracking + show response in spend logs (if enabled) (#9897) | 2025-04-10 21:24:45 -07:00 |
| exceptions.py | feat(openai.py): bubble all error information back to client | 2025-03-10 15:27:43 -07:00 |
| main.py | Handle fireworks ai tool calling response (#10130) | 2025-04-19 09:37:45 -07:00 |
| model_prices_and_context_window_backup.json | test(utils.py): handle scenario where text tokens + reasoning tokens … (#10165) | 2025-04-19 12:32:38 -07:00 |
| py.typed | | |
| router.py | Litellm UI qa 04 12 2025 p1 (#9955) | 2025-04-12 19:30:48 -07:00 |
| scheduler.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| timeout.py | Litellm ruff linting enforcement (#5992) | 2024-10-01 19:44:20 -04:00 |
| utils.py | Handle fireworks ai tool calling response (#10130) | 2025-04-19 09:37:45 -07:00 |