Name | Last commit message | Last commit date
anthropic_interface | docs anthropic messages interface | 2025-03-31 16:53:39 -07:00
assistants | fixes for assistans API tests | 2025-04-19 07:59:53 -07:00
batch_completion | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
batches | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00
caching | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00
experimental_mcp_client | fix types on tools.py | 2025-03-29 16:48:15 -07:00
files | Support CRUD endpoints for Managed Files (#9924) | 2025-04-11 21:48:27 -07:00
fine_tuning | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00
integrations | Add AgentOps Integration to LiteLLM (#9685) | 2025-04-22 10:29:01 -07:00
litellm_core_utils | Add AgentOps Integration to LiteLLM (#9685) | 2025-04-22 10:29:01 -07:00
llms | Gemini-2.5-flash improvements (#10198) | 2025-04-21 22:48:00 -07:00
openai-responses-starter-app@bf0485467c | Litellm docs SCIM (#10174) | 2025-04-19 18:29:09 -07:00
proxy | Add global filtering to Users tab (#10195) | 2025-04-22 13:59:43 -07:00
realtime_api | Realtime API: Support 'base_model' cost tracking + show response in spend logs (if enabled) (#9897) | 2025-04-10 21:24:45 -07:00
rerank_api | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00
responses | [Feat] Add Responses API - Routing Affinity logic for sessions (#10193) | 2025-04-21 20:00:27 -07:00
router_strategy | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00
router_utils | [Feat] Add Responses API - Routing Affinity logic for sessions (#10193) | 2025-04-21 20:00:27 -07:00
secret_managers | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00
types | Add global filtering to Users tab (#10195) | 2025-04-22 13:59:43 -07:00
__init__.py | Add AgentOps Integration to LiteLLM (#9685) | 2025-04-22 10:29:01 -07:00
_logging.py | fix(logging): add json formatting for uncaught exceptions (#9615) (#9619) | 2025-03-28 15:16:15 -07:00
_redis.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00
_service_logger.py | clean up service metrics | 2025-04-02 17:50:41 -07:00
_version.py | Virtual key based policies in Aim Guardrails (#9499) | 2025-04-01 21:57:23 -07:00
budget_manager.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00
constants.py | Gemini-2.5-flash - support reasoning cost calc + return reasoning content (#10141) | 2025-04-19 09:20:52 -07:00
cost.json | |
cost_calculator.py | Realtime API: Support 'base_model' cost tracking + show response in spend logs (if enabled) (#9897) | 2025-04-10 21:24:45 -07:00
exceptions.py | feat(openai.py): bubble all error information back to client | 2025-03-10 15:27:43 -07:00
main.py | [Feat] Add infinity embedding support (contributor pr) (#10196) | 2025-04-21 20:01:29 -07:00
model_prices_and_context_window_backup.json | fix azure/computer-use-preview native streaming | 2025-04-22 18:21:06 -07:00
py.typed | |
router.py | [Feat] Add Responses API - Routing Affinity logic for sessions (#10193) | 2025-04-21 20:00:27 -07:00
scheduler.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00
timeout.py | Litellm ruff linting enforcement (#5992) | 2024-10-01 19:44:20 -04:00
utils.py | [Feat] Add infinity embedding support (contributor pr) (#10196) | 2025-04-21 20:01:29 -07:00