litellm-mirror/litellm
Krish Dholakia d71aa2284b fix(litellm_proxy_extras): add baselining db script (#9942)
* fix(litellm_proxy_extras): add baselining db script

Fixes https://github.com/BerriAI/litellm/issues/9885
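
For context: baselining tells Prisma that an existing database already matches the migration history, so deploys do not try to re-run old migrations against live data. A minimal sketch of such a script, assuming the standard Prisma CLI and a hypothetical migrations path (the actual script shipped with this PR may differ):

    import subprocess
    from pathlib import Path

    # Assumed migrations directory layout, not the shipped one.
    MIGRATIONS_DIR = Path("litellm-proxy-extras/litellm_proxy_extras/migrations")

    for migration in sorted(MIGRATIONS_DIR.iterdir()):
        if migration.is_dir():
            # `prisma migrate resolve --applied <name>` records the migration
            # in _prisma_migrations without executing its SQL.
            subprocess.run(
                ["prisma", "migrate", "resolve", "--applied", migration.name],
                check=True,
            )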

* fix(prisma_client.py): fix ruff errors

* ci(config.yml): add publish_proxy_extras step

* fix(config.yml): compare contents between versions to check for changes

* fix(config.yml): fix check

* fix: install toml

* fix: update check

* fix: ensure versions in sync

* fix: fix version compare
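
Taken together, the toml install and the version checks above suggest a CI step along these lines. A rough sketch, assuming the litellm-proxy-extras pin lives in the main pyproject.toml; the real step in config.yml may differ:

    import sys
    import toml  # installed by the "fix: install toml" step above

    # Assumption: the version litellm pins for litellm-proxy-extras must
    # equal the version the sub-package declares for itself.
    extras_version = toml.load("litellm-proxy-extras/pyproject.toml")["tool"]["poetry"]["version"]
    pinned = toml.load("pyproject.toml")["tool"]["poetry"]["dependencies"]["litellm-proxy-extras"]
    if isinstance(pinned, dict):  # pins may be dicts, e.g. {"version": "^0.1.1"}
        pinned = pinned.get("version", "")

    # Poetry pins may look like "^0.1.1"; strip the range operator before comparing.
    if str(pinned).lstrip("^~>=") != extras_version:
        print(f"litellm-proxy-extras out of sync: pinned {pinned}, actual {extras_version}")
        sys.exit(1)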

* fix: correct the cost for 'gemini/gemini-2.5-pro-preview-03-25' (#9896)

* fix: typo in the cost of 'gemini/gemini-2.5-pro-preview-03-25', closes #9854

* chore: update in backup file as well

* Litellm: add managed files DB (#9930)

* fix(openai.py): ensure openai file object shows up in logs

* fix(managed_files.py): return unified file id as b64 str

allows retrieving a file by id to work as expected
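
Returning the unified id as base64 matters because OpenAI-compatible clients treat file ids as opaque tokens, and the internal id contains separators that clients may escape or reject. A minimal sketch of the encode/decode pair; the internal id format shown is illustrative only:

    import base64

    def encode_unified_file_id(unified_file_id: str) -> str:
        # URL-safe so the id survives path segments like /v1/files/{id}.
        return base64.urlsafe_b64encode(unified_file_id.encode("utf-8")).decode("utf-8")

    def decode_unified_file_id(b64_id: str) -> str:
        # Applied on retrieve/delete before routing to the backing provider.
        return base64.urlsafe_b64decode(b64_id.encode("utf-8")).decode("utf-8")

    opaque_id = encode_unified_file_id("litellm_proxy;unified_id,1234")  # illustrative format
    assert decode_unified_file_id(opaque_id) == "litellm_proxy;unified_id,1234"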

* fix(managed_files.py): apply decoded file id transformation

* fix: add unit test for file id + decode logic

* fix: initial commit for litellm_proxy support with CRUD Endpoints

* fix(managed_files.py): support retrieve file operation

* fix(managed_files.py): support for DELETE endpoint for files

* fix(managed_files.py): retrieve file content support

supports the retrieve-file-content API from OpenAI
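
Since the proxy mirrors the OpenAI files API, retrieval should work through the stock OpenAI SDK pointed at the proxy. A usage sketch with placeholder base URL and key:

    from openai import OpenAI

    # Placeholder proxy address and virtual key.
    client = OpenAI(api_key="sk-1234", base_url="http://localhost:4000")

    file_object = client.files.retrieve("file-abc123")    # GET /files/{id}
    file_content = client.files.content("file-abc123")    # GET /files/{id}/content
    print(file_object.id, len(file_content.read()))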

* fix: fix linting error

* test: update tests

* fix: fix linting error

* feat(managed_files.py): support reading / writing files in DB

* feat(managed_files.py): support deleting file from DB on delete
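
Persisting the unified file object in the DB (not just the cache) is what lets retrieve and delete survive proxy restarts. A sketch of the write and delete paths via prisma-client-py; the table and column names here are hypothetical, the real schema is in the new prisma migration added later in this PR:

    import json

    # Hypothetical table/column names -- see the prisma migration for the
    # actual schema.
    async def store_unified_file(prisma_client, unified_file_id: str, file_object: dict) -> None:
        await prisma_client.db.litellm_managedfiletable.create(
            data={
                "unified_file_id": unified_file_id,
                "file_object": json.dumps(file_object),
            }
        )

    async def delete_unified_file(prisma_client, unified_file_id: str) -> None:
        # Called from the DELETE /files/{id} path so DB state tracks the provider.
        await prisma_client.db.litellm_managedfiletable.delete(
            where={"unified_file_id": unified_file_id}
        )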

* test: update testing

* fix(spend_tracking_utils.py): ensure each file create request is logged correctly

* fix(managed_files.py): fix storing / returning managed file object from cache

* fix(files/main.py): pass litellm params to azure route

* test: fix test

* build: add new prisma migration

* build: bump requirements

* test: add more testing

* refactor: cleanup post merge w/ main

* fix: fix code qa errors

* [DB / Infra] Add new column team_member_permissions (#9941)

* add team_member_permissions to team table

* add migration.sql file
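
The new column stores, per team, which proxy routes a team member may call. A sketch of updating it through prisma-client-py; the column type (a list of route strings) is inferred from the PR title and may not match the final schema:

    # Assumption: LiteLLM_TeamTable gained a `team_member_permissions` column
    # holding the proxy routes a team member is allowed to call.
    async def set_team_member_permissions(
        prisma_client, team_id: str, permissions: list[str]
    ) -> None:
        await prisma_client.db.litellm_teamtable.update(
            where={"team_id": team_id},
            data={"team_member_permissions": permissions},
        )

    # e.g. allow members to create and inspect keys, but nothing else:
    # await set_team_member_permissions(prisma_client, "team-1", ["/key/generate", "/key/info"])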

* fix poetry lock

* fix prisma migrations

* fix poetry lock

* fix migration

* ui new build

* fix(factory.py): correct indentation for message index increment in ollama. Fixes bug #9822 (#9943)

* fix(factory.py): correct indentation for message index increment in ollama_pt function

* test: add unit tests for ollama_pt function handling various message types
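
The underlying bug is the classic while-loop pattern where the index increment sat one indentation level too deep, inside a branch, so any message that missed the branch was never consumed and the loop never advanced. A simplified sketch; the real loop in ollama_pt handles more message types:

    def iterate_messages(messages: list[dict]) -> list[str]:
        prompts, msg_i = [], 0
        while msg_i < len(messages):
            msg = messages[msg_i]
            if msg["role"] == "user":
                prompts.append(f"### User:\n{msg['content']}")
            elif msg["role"] == "assistant":
                prompts.append(f"### Assistant:\n{msg['content']}")
            # Before the fix, `msg_i += 1` was indented inside the last branch,
            # so a message that fell through (e.g. a tool message) left msg_i
            # unchanged and the loop spun forever. The fix keeps the increment
            # at loop level:
            msg_i += 1
        return prompts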

* ci: update test

* fix: fix check

* ci: see what dir looks like

* ci: more checks

* ci: fix filepath

* ci: cleanup

* ci: fix ci

---------

Co-authored-by: Nilanjan De <nilanjan.de@gmail.com>
Co-authored-by: Ishaan Jaff <ishaanjaffer0324@gmail.com>
Co-authored-by: Dan Shaw <dan@danieljshaw.com>
2025-04-12 10:29:34 -07:00
anthropic_interface docs anthropic messages interface 2025-03-31 16:53:39 -07:00
assistants refactor(azure.py): refactor to have client init work across all endpoints 2025-03-11 17:27:24 -07:00
batch_completion (code quality) run ruff rule to ban unused imports (#7313) 2024-12-19 12:33:42 -08:00
batches build(pyproject.toml): add new dev dependencies - for type checking (#9631) 2025-03-29 11:02:13 -07:00
caching Squashed commit of the following: (#9709) 2025-04-02 21:24:54 -07:00
experimental_mcp_client fix types on tools.py 2025-03-29 16:48:15 -07:00
files Support CRUD endpoints for Managed Files (#9924) 2025-04-11 21:48:27 -07:00
fine_tuning build(pyproject.toml): add new dev dependencies - for type checking (#9631) 2025-03-29 11:02:13 -07:00
integrations Support CRUD endpoints for Managed Files (#9924) 2025-04-11 21:48:27 -07:00
litellm_core_utils fix(factory.py): correct indentation for message index increment in ollama. Fixes bug #9822 (#9943) 2025-04-12 09:50:40 -07:00
llms fix(transformation.py): correctly translate 'thinking' param for lite… (#9904) 2025-04-11 23:25:13 -07:00
proxy fix(litellm_proxy_extras): add baselining db script (#9942) 2025-04-12 10:29:34 -07:00
realtime_api Realtime API: Support 'base_model' cost tracking + show response in spend logs (if enabled) (#9897) 2025-04-10 21:24:45 -07:00
rerank_api build(pyproject.toml): add new dev dependencies - for type checking (#9631) 2025-03-29 11:02:13 -07:00
responses build(pyproject.toml): add new dev dependencies - for type checking (#9631) 2025-03-29 11:02:13 -07:00
router_strategy build(pyproject.toml): add new dev dependencies - for type checking (#9631) 2025-03-29 11:02:13 -07:00
router_utils LiteLLM Minor Fixes & Improvements (04/02/2025) (#9725) 2025-04-03 11:48:52 -07:00
secret_managers Squashed commit of the following: (#9709) 2025-04-02 21:24:54 -07:00
types Litellm: add managed files DB (#9930) 2025-04-12 08:24:46 -07:00
__init__.py [Feat] Add litellm.supports_reasoning() util to track if an llm supports reasoning (#9923) 2025-04-11 17:56:04 -07:00
_logging.py fix(logging): add json formatting for uncaught exceptions (#9615) (#9619) 2025-03-28 15:16:15 -07:00
_redis.py Squashed commit of the following: (#9709) 2025-04-02 21:24:54 -07:00
_service_logger.py clean up service metrics 2025-04-02 17:50:41 -07:00
_version.py Virtual key based policies in Aim Guardrails (#9499) 2025-04-01 21:57:23 -07:00
budget_manager.py Squashed commit of the following: (#9709) 2025-04-02 21:24:54 -07:00
constants.py [Feat] Emit Key, Team Budget metrics on a cron job schedule (#9528) 2025-04-10 16:59:14 -07:00
cost.json
cost_calculator.py Realtime API: Support 'base_model' cost tracking + show response in spend logs (if enabled) (#9897) 2025-04-10 21:24:45 -07:00
exceptions.py feat(openai.py): bubble all error information back to client 2025-03-10 15:27:43 -07:00
main.py VertexAI non-jsonl file storage support (#9781) 2025-04-09 14:01:48 -07:00
model_prices_and_context_window_backup.json fix: correct the cost for 'gemini/gemini-2.5-pro-preview-03-25' (#9896) 2025-04-12 08:20:04 -07:00
py.typed
router.py Support CRUD endpoints for Managed Files (#9924) 2025-04-11 21:48:27 -07:00
scheduler.py Squashed commit of the following: (#9709) 2025-04-02 21:24:54 -07:00
timeout.py Litellm ruff linting enforcement (#5992) 2024-10-01 19:44:20 -04:00
utils.py [Feat] Add litellm.supports_reasoning() util to track if an llm supports reasoning (#9923) 2025-04-11 17:56:04 -07:00