litellm-mirror/litellm/responses
Latest commit: 0717369ae6 by Ishaan Jaff
[Feat] Expose Responses API on LiteLLM UI Test Key Page (#10166)
* add /responses API on UI
* add makeOpenAIResponsesRequest
* fix add responses API on UI
* fix endpoint selector
* responses API render chunks on litellm chat ui
* fixes to streaming iterator
* fix render responses completed events
* fixes for MockResponsesAPIStreamingIterator
* transform_responses_api_request_to_chat_completion_request
* fix for responses API
* test_basic_openai_responses_api_streaming
* fix base responses api tests

Committed 2025-04-19 13:18:54 -07:00
Contents:

litellm_completion_transformation/   [Feat] Expose Responses API on LiteLLM UI Test Key Page (#10166)   2025-04-19 13:18:54 -07:00
main.py                              [Feat] Support for all litellm providers on Responses API (works with Codex) - Anthropic, Bedrock API, VertexAI, Ollama (#10132)   2025-04-18 19:53:59 -07:00
streaming_iterator.py                [Feat] Expose Responses API on LiteLLM UI Test Key Page (#10166)   2025-04-19 13:18:54 -07:00
utils.py                             [Feat] Unified Responses API - Add Azure Responses API support (#10116)   2025-04-17 16:47:59 -07:00