(Bug fix) - allow using Assistants GET, DELETE on /openai pass through routes (#8818)
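In practice, the fix means the OpenAI SDK can be pointed at the proxy's /openai pass-through base path and issue Assistants GET and DELETE calls directly. A minimal usage sketch — the base URL, port, and virtual key below are illustrative assumptions, not values from this commit:

from openai import OpenAI

# Point the OpenAI SDK at the LiteLLM proxy's /openai pass-through path.
# The base_url and api_key are assumed values; substitute your proxy
# address and a LiteLLM virtual key.
client = OpenAI(
    api_key="sk-1234",
    base_url="http://localhost:4000/openai",
)

# GET: list assistants through the pass-through route
assistants = client.beta.assistants.list()

# DELETE: remove an assistant by id through the same route
if assistants.data:
    client.beta.assistants.delete(assistant_id=assistants.data[0].id)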

* test_openai_assistants_e2e_operations

* test openai assistants pass through

* fix GET request on pass through handler

* _make_non_streaming_http_request

* _is_assistants_api_request

* test_openai_assistants_e2e_operations

* test_openai_assistants_e2e_operations

* openai_proxy_route

* docs openai pass through

* docs openai pass through

* docs openai pass through

* test pass through handler

* Potential fix for code scanning alert no. 2240: Incomplete URL substring sanitization (see the hostname-check sketch after this message)

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>

---------

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
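The "Incomplete URL substring sanitization" alert referenced above typically flags host checks of the form "api.openai.com" in url, which a crafted URL such as https://api.openai.com.evil.example would also satisfy. A hedged sketch of the standard remediation — the hostname and helper name are illustrative, not the code changed by this commit:

from urllib.parse import urlparse

def _is_openai_host(url: str) -> bool:
    # Illustrative helper: compare the parsed hostname exactly instead of
    # substring-matching on the full URL string.
    hostname = urlparse(url).hostname or ""
    return hostname == "api.openai.com"

# flagged pattern:  "api.openai.com" in url
# safer check:      _is_openai_host(url)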
Ishaan Jaff, 2025-02-25 19:19:00 -08:00 (committed by GitHub)
parent bca6e37c24
commit 81039d8faf
8 changed files with 572 additions and 84 deletions


@@ -17,6 +17,7 @@ from litellm.proxy._types import (
     TeamCallbackMetadata,
     UserAPIKeyAuth,
 )
+from litellm.proxy.auth.route_checks import RouteChecks
 from litellm.router import Router
 from litellm.types.llms.anthropic import ANTHROPIC_API_HEADERS
 from litellm.types.services import ServiceTypes
@@ -59,7 +60,7 @@ def _get_metadata_variable_name(request: Request) -> str:
     For ALL other endpoints we call this "metadata
     """
-    if "thread" in request.url.path or "assistant" in request.url.path:
+    if RouteChecks._is_assistants_api_request(request):
         return "litellm_metadata"
     if "batches" in request.url.path:
         return "litellm_metadata"