Ishaan Jaff
5de101ab7b
[Feat] Add GET, DELETE Responses endpoints on LiteLLM Proxy ( #10297 )
...
* add GET responses endpoints on router
* add DELETE responses endpoints on proxy
* fixes for testing GET, DELETE endpoints
* test_basic_responses api e2e
2025-04-24 17:34:26 -07:00
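The new endpoints mirror the OpenAI Responses API, so a previously created response can be retrieved or deleted through the proxy. A minimal sketch, assuming a proxy on localhost:4000, a virtual key, a stored response id, and the standard `/v1/responses/{response_id}` paths (all illustrative values):

```python
import requests

# Assumptions: local proxy, hypothetical virtual key and response id.
PROXY_BASE = "http://localhost:4000"
HEADERS = {"Authorization": "Bearer sk-1234"}
response_id = "resp_abc123"

# GET /v1/responses/{response_id} -- retrieve a stored response
resp = requests.get(f"{PROXY_BASE}/v1/responses/{response_id}", headers=HEADERS)
print(resp.status_code, resp.json())

# DELETE /v1/responses/{response_id} -- delete the stored response
resp = requests.delete(f"{PROXY_BASE}/v1/responses/{response_id}", headers=HEADERS)
print(resp.status_code, resp.json())
```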
Michael Leshchinsky
e19d05980c
Add litellm call id passing to Aim guardrails on pre and post-hooks calls ( #10021 )
...
* Add litellm_call_id passing to aim guardrails on pre and post-hooks
* Add test that ensures that pre_call_hook receives litellm call id when common_request_processing called
2025-04-16 07:41:28 -07:00
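For context, guardrail hooks on the proxy receive the request payload, and this change threads the per-request `litellm_call_id` through to Aim on both the pre- and post-call paths. A minimal sketch of a hook reading that id; the hook signature is abbreviated and the `"litellm_call_id"` key is an assumption based on the commit description:

```python
from typing import Optional

# Simplified, hypothetical hook -- real proxy hooks take more arguments;
# the "litellm_call_id" field access is assumed from the commit message.
class AimStyleGuardrail:
    async def async_pre_call_hook(self, data: dict, call_type: str) -> Optional[dict]:
        call_id = data.get("litellm_call_id")
        # Forward the id to the guardrail backend so pre- and post-call
        # checks for the same request can be correlated.
        print(f"guardrail pre-call check, litellm_call_id={call_id}")
        return data
```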
Krish Dholakia
4a128cfd64
Realtime API Cost tracking ( #9795 )
...
* fix(proxy_server.py): log realtime calls to spendlogs
Fixes https://github.com/BerriAI/litellm/issues/8410
* feat(realtime/): OpenAI Realtime API cost tracking
Closes https://github.com/BerriAI/litellm/issues/8410
* test: add unit testing for coverage
* test: add more unit testing
* fix: handle edge cases
2025-04-07 16:43:12 -07:00
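Realtime sessions are usage-based like other calls, so cost tracking comes down to multiplying token counts by per-token prices and writing the result to SpendLogs. A back-of-the-envelope sketch of that arithmetic (the prices and token counts below are made-up illustrative numbers, not LiteLLM's actual pricing table):

```python
# Illustrative only: hypothetical per-token prices and usage numbers.
input_cost_per_token = 0.000005    # e.g. input tokens
output_cost_per_token = 0.000020   # e.g. output tokens

usage = {"input_tokens": 1_200, "output_tokens": 450}

total_cost = (
    usage["input_tokens"] * input_cost_per_token
    + usage["output_tokens"] * output_cost_per_token
)
print(f"realtime call cost: ${total_cost:.6f}")  # value recorded in SpendLogs
```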
Krish Dholakia
f2a7edaddc
fix(proxy_server.py): Fix "Circular reference detected" error when max_parallel_requests = 0 ( #9671 )
...
* fix(proxy_server.py): remove non-functional parent backoff/retry on /chat/completion
Causes circular reference error
* fix(http_parsing_utils.py): safely return parsed body - don't allow mutation of cached request body by client functions
Root cause fix for circular reference error
* Revert "fix: Anthropic prompt caching on GCP Vertex AI (#9605 )" (#9670 )
This reverts commit a8673246dc.
* add type hints for AnthropicMessagesResponse
* define types for response from AnthropicMessagesResponse
* fix response typing
* allow using litellm.messages.acreate and litellm.messages.create
* fix anthropic_messages implementation
* add clear type hints to litellm.messages.create functions
* fix anthropic_messages
* working anthropic API tests
* fixes - anthropic messages interface
* use new anthropic interface
* fix code quality check
* docs anthropic messages endpoint
* add namespace_packages = True to mypy
* fix mypy lint errors
* docs anthropic messages interface
* test: fix unit test
* test(test_http_parsing_utils.py): update tests
---------
Co-authored-by: Ishaan Jaff <ishaanjaffer0324@gmail.com>
2025-03-31 22:06:02 -07:00
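The root-cause fix in `http_parsing_utils.py` is defensive: return the cached parsed request body in a way that downstream handlers cannot mutate the cached object, which is what produced the "Circular reference detected" error. A minimal sketch of that pattern, using hypothetical names rather than the actual litellm internals:

```python
import copy

# Hypothetical cache and helper -- illustrates the idea, not the real
# http_parsing_utils implementation.
_parsed_body_cache: dict = {}

async def get_request_body(request) -> dict:
    cache_key = id(request)
    if cache_key not in _parsed_body_cache:
        _parsed_body_cache[cache_key] = await request.json()
    # Hand back a copy so callers can't mutate the cached body and end up
    # with self-referencing structures.
    return copy.deepcopy(_parsed_body_cache[cache_key])
```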
Krrish Dholakia
943e036851
feat(pass_through_endpoints.py): support returning api-base on pass-through endpoints
...
Make it easier to debug which api base was sent to the provider
2025-03-20 20:11:49 -07:00
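The goal here is debuggability: when a request goes through a pass-through endpoint, the proxy surfaces which upstream api base it actually hit. A minimal sketch of checking that from the client side; the route and the header name below are assumptions for illustration, not confirmed by this log:

```python
import requests

# Hypothetical pass-through route, key, and header name.
resp = requests.post(
    "http://localhost:4000/gemini/v1beta/models/gemini-pro:generateContent",
    headers={"Authorization": "Bearer sk-1234"},
    json={"contents": [{"parts": [{"text": "hello"}]}]},
)
# Inspect response headers to see which provider api base was used.
print(resp.headers.get("x-litellm-api-base"))  # assumed header name
```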
Ishaan Jaff
db1f48bbfb
responses_api
2025-03-12 20:38:05 -07:00
Ishaan Jaff
e7e44694cc
fix response api handling
2025-03-12 18:49:17 -07:00
Ishaan Jaff
8689a25396
_handle_llm_api_exception
2025-03-12 17:47:11 -07:00
Ishaan Jaff
f2041762e6
add responses_api
2025-03-12 17:08:16 -07:00
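This commit, together with the fixes above it, adds the `/v1/responses` route to the proxy. Because the route is OpenAI-compatible, a recent OpenAI SDK pointed at the proxy can exercise it. A minimal sketch, assuming a local proxy, a virtual key, and a model name from the proxy config:

```python
from openai import OpenAI

# Assumptions: proxy at localhost:4000 and a hypothetical virtual key.
client = OpenAI(base_url="http://localhost:4000/v1", api_key="sk-1234")

# POST /v1/responses through the proxy (model name depends on proxy config).
response = client.responses.create(
    model="gpt-4o-mini",
    input="Write a one-line haiku about proxies.",
)
print(response.output_text)
```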
Ishaan Jaff
b2d9935567
use ProxyBaseLLMRequestProcessing
2025-03-12 16:54:33 -07:00