mirror of
https://github.com/BerriAI/litellm.git
synced 2025-04-25 10:44:24 +00:00
[Feat] Support for all litellm providers on Responses API (works with Codex) - Anthropic, Bedrock API, VertexAI, Ollama (#10132)
* transform request
* basic handler for LiteLLMCompletionTransformationHandler
* complete transform litellm to responses api
* fixes to test
* fix stream=True
* fix streaming iterator
* fixes for transformation
* fixes for anthropic codex support
* fix pass response_api_optional_params
* test anthropic responses api tools
* update responses types
* working codex with litellm
* add session handler
* fixes streaming iterator
* fix handler
* add litellm codex example
* fix code quality
* test fix
* docs litellm codex
* litellm codexdoc
* docs openai codex with litellm
* docs litellm openai codex
* litellm codex
* linting fixes for transforming responses API
* fix import error
* fix responses api test
* add sync iterator support for responses api
This commit is contained in:
parent
3e87ec4f16
commit
3d5022bd79
14 changed files with 1282 additions and 53 deletions
```diff
@@ -68,16 +68,16 @@ def validate_responses_api_response(response, final_chunk: bool = False):
     "metadata": dict,
     "model": str,
     "object": str,
-    "temperature": (int, float),
+    "temperature": (int, float, type(None)),
     "tool_choice": (dict, str),
     "tools": list,
-    "top_p": (int, float),
+    "top_p": (int, float, type(None)),
     "max_output_tokens": (int, type(None)),
     "previous_response_id": (str, type(None)),
     "reasoning": dict,
     "status": str,
     "text": ResponseTextConfig,
-    "truncation": str,
+    "truncation": (str, type(None)),
     "usage": ResponseAPIUsage,
     "user": (str, type(None)),
 }
```
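The hunk above relaxes the test's expected-type map so that fields like `temperature`, `top_p`, and `truncation` may also be `None`, which non-OpenAI providers can return. As a minimal sketch (not the actual test helper — field names are taken from the diff, the validator function is hypothetical), such a type map can be applied with `isinstance`, where `type(None)` in a tuple marks an optional field:

```python
# Hypothetical sketch of a type-map validator like the one exercised in the
# diff above. Each response field maps to an allowed type or tuple of types;
# including type(None) in the tuple makes the field nullable.
response_schema = {
    "model": str,
    "temperature": (int, float, type(None)),  # None now allowed per this change
    "top_p": (int, float, type(None)),
    "truncation": (str, type(None)),
    "user": (str, type(None)),
}

def validate_fields(response: dict, schema: dict) -> list:
    """Return the names of fields whose values do not match the schema."""
    return [
        field
        for field, expected in schema.items()
        if field in response and not isinstance(response[field], expected)
    ]

# A provider response that omits sampling params (None) now validates cleanly.
sample = {
    "model": "claude-3-sonnet",
    "temperature": None,
    "top_p": 1.0,
    "truncation": None,
    "user": None,
}
print(validate_fields(sample, response_schema))  # → []
```

Under the pre-change schema (`"temperature": (int, float)`), the same `sample` would fail, since `isinstance(None, (int, float))` is `False`.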