Commit graph

7 commits

Author SHA1 Message Date
Krish Dholakia
f2a7edaddc
fix(proxy_server.py): Fix "Circular reference detected" error when max_parallel_requests = 0 (#9671)
* fix(proxy_server.py): remove non-functional parent backoff/retry on /chat/completion

Causes circular reference error

* fix(http_parsing_utils.py): safely return parsed body - don't allow mutation of cached request body by client functions

Root cause fix for circular reference error
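The caching pattern described above can be sketched as follows. This is a hypothetical illustration, not LiteLLM's actual http_parsing_utils.py code: the function name, cache shape, and key are all assumptions; the point is only that callers receive a copy, so mutating it cannot corrupt the cached body.

```python
import json

# Hypothetical cache of parsed request bodies, keyed by a request id.
_body_cache: dict[int, dict] = {}


def get_parsed_body(request_id: int, raw: bytes) -> dict:
    """Parse the body once, cache it, and hand callers a copy."""
    if request_id not in _body_cache:
        _body_cache[request_id] = json.loads(raw)
    # Return a shallow copy so client functions cannot mutate the cached
    # dict in place (a real fix may need a deep copy for nested bodies).
    return dict(_body_cache[request_id])
```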

* Revert "fix: Anthropic prompt caching on GCP Vertex AI (#9605)" (#9670)

This reverts commit a8673246dc.

* add type hints for AnthropicMessagesResponse

* define types for response from AnthropicMessagesResponse

* fix response typing

* allow using litellm.messages.acreate and litellm.messages.create
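Exposing both a sync `create` and an async `acreate` typically follows the pattern sketched below. The function bodies and signatures here are illustrative placeholders, not LiteLLM's actual implementation; a real client would issue the provider API call inside `create`.

```python
import asyncio


def create(model: str, messages: list, max_tokens: int) -> dict:
    # Placeholder for the synchronous call path; a real client
    # would send the request to the provider here.
    return {"model": model, "messages": messages, "max_tokens": max_tokens}


async def acreate(model: str, messages: list, max_tokens: int) -> dict:
    # Async variant delegating to the sync path in a worker thread,
    # a common way to offer both interfaces from one code path.
    return await asyncio.to_thread(create, model, messages, max_tokens)
```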

* fix anthropic_messages implementation

* add clear type hints to litellm.messages.create functions

* fix anthropic_messages

* working anthropic API tests

* fixes - anthropic messages interface

* use new anthropic interface

* fix code quality check

* docs anthropic messages endpoint

* add namespace_packages = True to mypy
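The setting referenced above would look like this in a mypy config file (mypy.ini or the `[mypy]` section of setup.cfg); it tells mypy to recognize PEP 420 namespace packages, i.e. package directories without an `__init__.py`:

```ini
[mypy]
namespace_packages = True
```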

* fix mypy lint errors

* docs anthropic messages interface

* test: fix unit test

* test(test_http_parsing_utils.py): update tests

---------

Co-authored-by: Ishaan Jaff <ishaanjaffer0324@gmail.com>
2025-03-31 22:06:02 -07:00
Krrish Dholakia
943e036851 feat(pass_through_endpoints.py): support returning api-base on pass-through endpoints
Makes it easier to debug which api base was sent to the provider
2025-03-20 20:11:49 -07:00
Ishaan Jaff
db1f48bbfb responses_api 2025-03-12 20:38:05 -07:00
Ishaan Jaff
e7e44694cc fix response api handling 2025-03-12 18:49:17 -07:00
Ishaan Jaff
8689a25396 _handle_llm_api_exception 2025-03-12 17:47:11 -07:00
Ishaan Jaff
f2041762e6 add responses_api 2025-03-12 17:08:16 -07:00
Ishaan Jaff
b2d9935567 use ProxyBaseLLMRequestProcessing 2025-03-12 16:54:33 -07:00