litellm-mirror/tests/litellm/litellm_core_utils

Latest commit 217681eb5e by Krish Dholakia: Litellm dev 04 22 2025 p1 (#10206)
* fix(openai.py): initial commit adding generic event type for openai responses api streaming

Ensures handling for undocumented event types - e.g. "response.reasoning_summary_part.added"

* fix(transformation.py): handle unknown openai response type

* fix(datadog_llm_observability.py): handle dict[str, any] -> dict[str, str] conversion

Fixes https://github.com/BerriAI/litellm/issues/9494

* test: add more unit testing

* test: add unit test

* fix(common_utils.py): fix message with content list

* test: update testing
Committed 2025-04-22 23:58:43 -07:00
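The first fix in the commit body describes adding a generic event type so that undocumented Responses API streaming events (such as "response.reasoning_summary_part.added") are carried through rather than raising. A minimal sketch of that fallback-dispatch pattern, using hypothetical names (`GenericStreamingEvent`, `EVENT_PARSERS`, `parse_event`) that do not necessarily match litellm's actual implementation:

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict


@dataclass
class ResponseCreatedEvent:
    """A documented event type (illustrative only)."""
    response_id: str


@dataclass
class GenericStreamingEvent:
    """Catch-all container for event types with no dedicated model."""
    type: str
    data: Dict[str, Any] = field(default_factory=dict)


# Map documented event types to their parsers (illustrative subset).
EVENT_PARSERS: Dict[str, Callable[[Dict[str, Any]], Any]] = {
    "response.created": lambda raw: ResponseCreatedEvent(
        response_id=raw.get("response", {}).get("id", "")
    ),
}


def parse_event(raw: Dict[str, Any]) -> Any:
    """Parse a raw streaming chunk, falling back to the generic
    event instead of raising on an unknown `type` value."""
    event_type = raw.get("type", "")
    parser = EVENT_PARSERS.get(event_type)
    if parser is None:
        # Undocumented event type: preserve the raw payload.
        return GenericStreamingEvent(type=event_type, data=raw)
    return parser(raw)
```

The key design choice is that unknown types degrade gracefully into a typed wrapper instead of a `KeyError` or validation failure, so new server-side event types cannot break existing stream consumers.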
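The Datadog LLM observability fix mentions converting dict[str, any] to dict[str, str]. A hedged sketch of such a coercion helper; the name `stringify_metadata` and the JSON-serialization fallback are assumptions for illustration, not litellm's actual code:

```python
import json
from typing import Any, Dict


def stringify_metadata(metadata: Dict[str, Any]) -> Dict[str, str]:
    """Coerce every value to a string so that APIs requiring a
    Dict[str, str] payload accept the mapping.

    Nested structures are serialized as JSON; anything JSON cannot
    handle falls back to str().
    """
    out: Dict[str, str] = {}
    for key, value in metadata.items():
        if isinstance(value, str):
            out[key] = value
        else:
            try:
                out[key] = json.dumps(value, default=str)
            except (TypeError, ValueError):
                out[key] = str(value)
    return out
```

Serializing nested values as JSON (rather than `str()`) keeps them machine-parseable on the receiving side, which matters when the observability backend indexes tag values.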
llm_cost_calc | test(utils.py): handle scenario where text tokens + reasoning tokens … (#10165) | 2025-04-19 12:32:38 -07:00
prompt_templates | Litellm dev 04 22 2025 p1 (#10206) | 2025-04-22 23:58:43 -07:00
test_core_helpers.py | fix(triton/completion/transformation.py): remove bad_words / stop wor… (#10163) | 2025-04-19 11:23:37 -07:00
test_dd_tracing.py | (bug fix) - dd tracer, only send traces when user opts into sending dd-trace (#8928) | 2025-03-01 10:53:36 -08:00
test_litellm_logging.py | test: add unit test | 2025-03-24 14:45:20 -07:00
test_realtime_streaming.py | Realtime API Cost tracking (#9795) | 2025-04-07 16:43:12 -07:00
test_safe_json_dumps.py | (Bug fix) - Cache Health not working when configured with prometheus service logger (#8687) | 2025-02-20 13:41:56 -08:00
test_streaming_chunk_builder_utils.py | fix(stream_chunk_builder_utils.py): don't set index on modelresponse (#10063) | 2025-04-16 10:11:47 -07:00
test_streaming_handler.py | test: fix flaky test | 2025-04-07 19:42:58 -07:00