litellm-mirror/litellm/llms/custom_httpx
Krish Dholakia 6e35f7ae4d Fix calling claude via invoke route + response_format support for claude on invoke route (#8908)
* fix(anthropic_claude3_transformation.py): fix amazon anthropic claude 3 tool calling transformation on invoke route

move to using anthropic config as base

* fix(utils.py): expose anthropic config via providerconfigmanager

* fix(llm_http_handler.py): support json mode on async completion calls

* fix(invoke_handler/make_call): support json mode for anthropic called via bedrock invoke

* fix(anthropic/): handle 'response_format: {"type": "text"}' + migrate amazon claude 3 invoke config to inherit from anthropic config

Prevents an error when passing in 'response_format: {"type": "text"}'

* test: fix test

* fix(utils.py): fix base invoke provider check

* fix(anthropic_claude3_transformation.py): don't pass 'stream' param

* fix: fix linting errors

* fix(converse_transformation.py): handle response_format type=text for converse
2025-02-28 17:56:26 -08:00
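For context on what the commit above enables, here is a minimal usage sketch: calling a Claude model on the Bedrock invoke route with a `response_format` set. The model ID, prompts, and credential handling are illustrative assumptions, not taken from the commit; AWS credentials are assumed to be configured in the environment as usual.

```python
# Sketch (assumed usage): Claude on the Bedrock invoke route with response_format.
# The model ID below is an illustrative example, not taken from this commit.
import litellm

# JSON mode on the invoke route (per the commit, async completion calls support it too).
response = litellm.completion(
    model="bedrock/invoke/anthropic.claude-3-sonnet-20240229-v1:0",  # assumed example model ID
    messages=[{"role": "user", "content": "Return the capital of France as JSON."}],
    response_format={"type": "json_object"},
)
print(response.choices[0].message.content)

# response_format={"type": "text"} is now handled instead of raising an error,
# on both the invoke and converse routes.
response_text = litellm.completion(
    model="bedrock/invoke/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": "Say hello."}],
    response_format={"type": "text"},
)
```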
aiohttp_handler.py [BETA] Add OpenAI /images/variations + Topaz API support (#7700) 2025-01-11 23:27:46 -08:00
http_handler.py Litellm dev 02 07 2025 p2 (#8377) 2025-02-07 17:30:38 -08:00
httpx_handler.py Ollama ssl verify = False + Spend Logs reliability fixes (#7931) 2025-01-23 23:05:41 -08:00
llm_http_handler.py Fix calling claude via invoke route + response_format support for claude on invoke route (#8908) 2025-02-28 17:56:26 -08:00