Litellm dev 04 22 2025 p1 (#10206)

* fix(openai.py): initial commit adding generic event type for openai responses api streaming

Ensures handling of undocumented event types, e.g. "response.reasoning_summary_part.added"
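A minimal sketch of the idea behind a generic fallback event type. The class and function names below (`GenericEvent`, `parse_event`) are hypothetical and not litellm's actual API; the point is that an unrecognized `type` string no longer raises during parsing.

```python
from typing import Any, Dict


class GenericEvent:
    """Hypothetical catch-all for event types not modeled explicitly."""

    def __init__(self, type: str, **fields: Any) -> None:
        self.type = type
        self.fields: Dict[str, Any] = fields


def parse_event(raw: Dict[str, Any]) -> GenericEvent:
    # A real parser would dispatch known types to concrete classes;
    # anything undocumented (e.g. "response.reasoning_summary_part.added")
    # falls through to the generic wrapper instead of erroring.
    return GenericEvent(**raw)
```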

* fix(transformation.py): handle unknown openai response type

* fix(datadog_llm_observability.py): handle Dict[str, Any] -> Dict[str, str] conversion

Fixes https://github.com/BerriAI/litellm/issues/9494
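A rough sketch of what such a conversion can look like. The helper name `stringify_metadata` is an assumption, not litellm's actual function; it illustrates coercing arbitrary metadata values into strings before handing them to Datadog, which expects string-valued fields.

```python
import json
from typing import Any, Dict


def stringify_metadata(metadata: Dict[str, Any]) -> Dict[str, str]:
    # Hypothetical helper: Datadog payload fields must be strings,
    # so JSON-encode any non-string value (dicts, lists, numbers).
    out: Dict[str, str] = {}
    for key, value in metadata.items():
        out[key] = value if isinstance(value, str) else json.dumps(value, default=str)
    return out
```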

* test: add more unit testing

* test: add unit test

* fix(common_utils.py): fix message with content list
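A sketch of the underlying pattern, under the assumption that the fix concerns messages whose `content` is a list of typed parts rather than a plain string (as in OpenAI-style chat messages). `get_content_str` is a hypothetical name, not the actual helper in common_utils.py.

```python
from typing import Any, Dict, List, Union


def get_content_str(content: Union[str, List[Dict[str, Any]]]) -> str:
    # Messages may carry content as a plain string or as a list of
    # typed parts; join the text parts so callers always get a str.
    if isinstance(content, str):
        return content
    return "".join(
        part.get("text", "") for part in content if part.get("type") == "text"
    )
```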

* test: update testing
Krish Dholakia 2025-04-22 23:58:43 -07:00 committed by GitHub
parent f670ebeb2f
commit 217681eb5e
12 changed files with 165 additions and 10 deletions


@@ -1296,7 +1296,7 @@ class ProxyConfig:
             config=config, base_dir=os.path.dirname(os.path.abspath(file_path or ""))
         )
-        verbose_proxy_logger.debug(f"loaded config={json.dumps(config, indent=4)}")
+        # verbose_proxy_logger.debug(f"loaded config={json.dumps(config, indent=4)}")
         return config

     def _process_includes(self, config: dict, base_dir: str) -> dict: