litellm-mirror/litellm/proxy/pass_through_endpoints
Krish Dholakia 501885d653 Litellm code qa common config (#7113)
* feat(base_llm): initial commit for common base config class

Addresses code QA critique: https://github.com/andrewyng/aisuite/issues/113#issuecomment-2512369132

* feat(base_llm/): add transform request/response abstract methods to base config class

* feat(cohere-+-clarifai): refactor integrations to use common base config class

* fix: fix linting errors

* refactor(anthropic/): move anthropic + vertex anthropic to use base config

* test: fix xai test

* test: fix tests

* fix: fix linting errors

* test: comment out WIP test

* fix(transformation.py): fix the "is PDF used" check

* fix: fix linting error
2024-12-09 15:58:25 -08:00
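The refactor described in this commit centers on a shared base config class with abstract request/response transformation hooks that each provider integration (Cohere, Clarifai, Anthropic, Vertex Anthropic) implements. The sketch below is illustrative only; the names `BaseLLMConfig`, `transform_request`, and `transform_response`, and the toy Cohere mapping, are assumptions rather than LiteLLM's actual API.

```python
# Illustrative sketch only -- class and method names are hypothetical,
# not LiteLLM's actual base config interface.
from abc import ABC, abstractmethod
from typing import Any, Dict, List


class BaseLLMConfig(ABC):
    """Common base config that provider integrations subclass.

    Centralizes the request/response transformation contract so each
    provider (Cohere, Clarifai, Anthropic, ...) only supplies its own
    provider-specific mapping logic.
    """

    @abstractmethod
    def transform_request(
        self,
        model: str,
        messages: List[Dict[str, Any]],
        optional_params: Dict[str, Any],
    ) -> Dict[str, Any]:
        """Map an OpenAI-style request into the provider's wire format."""
        ...

    @abstractmethod
    def transform_response(
        self,
        model: str,
        raw_response: Dict[str, Any],
    ) -> Dict[str, Any]:
        """Map the provider's raw response back into an OpenAI-style response."""
        ...


class ExampleCohereConfig(BaseLLMConfig):
    """Toy provider subclass showing how the hooks are filled in."""

    def transform_request(self, model, messages, optional_params):
        # Simplified, Cohere-style chat payload for illustration.
        return {"model": model, "message": messages[-1]["content"], **optional_params}

    def transform_response(self, model, raw_response):
        return {
            "model": model,
            "choices": [
                {"message": {"role": "assistant", "content": raw_response.get("text", "")}}
            ],
        }
```

With this split, shared concerns (parameter validation, linting-friendly typing, test coverage) live once in the base class, which matches the commit's goal of addressing the code QA critique.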
llm_provider_handlers Litellm code qa common config (#7113) 2024-12-09 15:58:25 -08:00
llm_passthrough_endpoints.py (feat) pass through llm endpoints - add PATCH support (vertex context caching requires it for update ops) (#6924) 2024-11-26 14:39:13 -08:00
pass_through_endpoints.py run ci/cd again for new release 2024-11-26 00:26:27 -08:00
streaming_handler.py (fix) pass through endpoints - run logging async + use thread pool executor for sync logging callbacks (see the sketch after this listing) (#6907) 2024-11-25 22:52:05 -08:00
success_handler.py LiteLLM Minor Fixes & Improvements (12/05/2024) (#7051) 2024-12-06 14:29:53 -08:00
types.py (feat) add usage / cost tracking for Anthropic passthrough routes (#6835) 2024-11-20 17:25:12 -08:00
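The streaming_handler.py entry above references a pattern worth spelling out: await async logging callbacks directly, and push synchronous logging callbacks onto a thread pool so a slow callback cannot block the event loop. The sketch below is a minimal illustration under assumed names (`run_logging_callbacks`, `_sync_callback_executor`); it is not LiteLLM's actual handler code.

```python
# Minimal sketch of the "async logging + thread pool for sync callbacks" pattern.
# Names are hypothetical, not taken from LiteLLM's streaming_handler.py.
import asyncio
from concurrent.futures import ThreadPoolExecutor

_sync_callback_executor = ThreadPoolExecutor(max_workers=4, thread_name_prefix="logging")


async def run_logging_callbacks(payload: dict, async_callbacks, sync_callbacks) -> None:
    """Fire logging callbacks without blocking the request path.

    Async callbacks run as tasks and are awaited concurrently; sync callbacks
    are offloaded to a thread pool so they never stall the event loop.
    """
    loop = asyncio.get_running_loop()
    tasks = [asyncio.create_task(cb(payload)) for cb in async_callbacks]
    tasks += [
        loop.run_in_executor(_sync_callback_executor, cb, payload)
        for cb in sync_callbacks
    ]
    if tasks:
        # return_exceptions=True keeps one failing logger from breaking the others.
        await asyncio.gather(*tasks, return_exceptions=True)


# Example usage with hypothetical callbacks:
#   await run_logging_callbacks({"model": "..."}, [async_logger], [sync_logger])
```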