Ishaan Jaff
e2d612efd9
Bug fix - String "data:" stripped from entire content in streamed Gemini responses (#9070)
...
* _strip_sse_data_from_chunk
* use _strip_sse_data_from_chunk
* use _strip_sse_data_from_chunk
* use _strip_sse_data_from_chunk
* _strip_sse_data_from_chunk
* test_strip_sse_data_from_chunk
* _strip_sse_data_from_chunk
* testing
* _strip_sse_data_from_chunk
2025-03-07 21:06:39 -08:00
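The commit above fixes a bug where the literal SSE prefix `data: ` was being removed everywhere in a streamed Gemini chunk instead of only at the start of each event line. A minimal sketch of what a helper like `_strip_sse_data_from_chunk` might do (names and behavior assumed from the PR title, not litellm's actual implementation):

```python
def strip_sse_data_from_chunk(chunk: str) -> str:
    """Remove a leading 'data: ' SSE field prefix from each line of a chunk.

    Only the prefix at the start of a line is stripped; occurrences of the
    literal string 'data: ' inside the payload itself are preserved.
    """
    SSE_DATA_PREFIX = "data: "
    out_lines = []
    for line in chunk.split("\n"):
        if line.startswith(SSE_DATA_PREFIX):
            out_lines.append(line[len(SSE_DATA_PREFIX):])
        else:
            out_lines.append(line)
    return "\n".join(out_lines)
```

The key point is anchoring the match to the start of the line: a naive `chunk.replace("data: ", "")` would also corrupt model output that happens to contain that string.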
Ishaan Jaff
b02af305de
[Feat] - Display thinking tokens on OpenWebUI (Bedrock, Anthropic, Deepseek) (#9029)
...
* if merge_reasoning_content_in_choices
* _optional_combine_thinking_block_in_choices
* stash changes
* working merge_reasoning_content_in_choices with bedrock
* fix litellm_params accessor
* fix streaming handler
* merge_reasoning_content_in_choices
* _optional_combine_thinking_block_in_choices
* test_bedrock_stream_thinking_content_openwebui
* merge_reasoning_content_in_choices
* fix for _optional_combine_thinking_block_in_choices
* linting error fix
2025-03-06 18:32:58 -08:00
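The `merge_reasoning_content_in_choices` work above folds a model's separate reasoning field into the visible message so chat UIs like OpenWebUI can render the thinking tokens. A hedged sketch of the idea (field names and the `<think>` wrapping convention are assumptions, not litellm's confirmed format):

```python
def merge_reasoning_into_content(message: dict) -> dict:
    """Sketch: fold a 'reasoning_content' field into the message content,
    wrapped in <think> tags, so UIs that understand the convention can
    display thinking tokens inline. Field names assumed for illustration."""
    reasoning = message.get("reasoning_content")
    if not reasoning:
        return message
    merged = dict(message)
    merged["content"] = f"<think>{reasoning}</think>{message.get('content') or ''}"
    merged.pop("reasoning_content", None)
    return merged
```

Making the merge opt-in (per the `if merge_reasoning_content_in_choices` bullet) keeps the default response shape unchanged for clients that handle `reasoning_content` natively.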
Krish Dholakia
94d28d59e4
Fix deepseek 'reasoning_content' error (#8963)
...
* fix(streaming_handler.py): fix deepseek reasoning content streaming
Fixes https://github.com/BerriAI/litellm/issues/8939
* test(test_streaming_handler.py): add unit test for the streaming handler's 'is_chunk_non_empty' function
ensures 'reasoning_content' is handled correctly
2025-03-03 14:34:10 -08:00
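The deepseek fix hinges on the streaming handler not discarding chunks that carry only reasoning tokens. A minimal sketch of an `is_chunk_non_empty`-style check (delta field names assumed, not litellm's exact signature):

```python
def is_chunk_non_empty(delta: dict) -> bool:
    """Sketch: a streamed delta counts as non-empty if it carries content,
    tool calls, OR reasoning_content -- the deepseek bug was that chunks
    containing only reasoning tokens were treated as empty and dropped."""
    return bool(
        delta.get("content")
        or delta.get("tool_calls")
        or delta.get("reasoning_content")
    )
```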
Ishaan Jaff
ee7cd60fdb
Revert "(bug fix) - don't log messages, prompt, input in model_parameters in StandardLoggingPayload (#8923)"
...
This reverts commit a119cb420b.
2025-03-01 11:05:33 -08:00
Ishaan Jaff
6fc9aa1612
(bug fix) - dd tracer, only send traces when user opts into sending dd-trace (#8928)
...
* fix dd tracing null tracer bug
* fix dd tracing
* fix base aws llm
* test_should_use_dd_tracer
2025-03-01 10:53:36 -08:00
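The `_should_use_dd_tracer` fix above makes Datadog tracing opt-in rather than on by default. A sketch of the gating check (the environment variable name is an assumption for illustration):

```python
import os

def should_use_dd_tracer() -> bool:
    """Sketch: only enable the Datadog tracer when the user explicitly
    opts in via an environment flag (variable name assumed). With no
    opt-in, no traces are ever initialized or sent."""
    return os.getenv("USE_DDTRACE", "false").lower() == "true"
```

Defaulting to `"false"` means the null-tracer code path mentioned in the bullets is never reached unless the operator asked for tracing.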
Ishaan Jaff
a119cb420b
(bug fix) - don't log messages, prompt, input in model_parameters in StandardLoggingPayload (#8923)
...
* fix _get_model_parameters
* test litellm logging
* test litellm logging
2025-03-01 10:27:24 -08:00
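The `_get_model_parameters` fix above keeps raw user inputs out of the logged parameter dict. A sketch of that filtering (the exact key set is assumed from the PR title):

```python
# Keys assumed from the PR title; the real exclusion list may differ.
SENSITIVE_PARAM_KEYS = {"messages", "prompt", "input"}

def get_model_parameters(optional_params: dict) -> dict:
    """Sketch: build the StandardLoggingPayload 'model_parameters' dict
    while excluding raw user inputs, so prompts and messages are never
    duplicated into logging/observability backends."""
    return {k: v for k, v in optional_params.items() if k not in SENSITIVE_PARAM_KEYS}
```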
Ishaan Jaff
7021f2f244
(Bug fix) dd-trace used by default on litellm proxy (#8817)
...
* fix _should_use_dd_tracer
* fix _should_use_dd_tracer
* _should_use_dd_tracer
* _should_use_dd_tracer
* _should_use_dd_tracer
* _init_dd_tracer
* _should_use_dd_tracer
* fix should use dd-tracer
* fix dd tracer
2025-02-25 19:54:22 -08:00
Ishaan Jaff
300d7825f5
(Observability) - Add more detailed dd tracing on Proxy Auth, Bedrock Auth (#8693)
...
* add dd tracer
* fix dd tracing
* add @tracer.wrap() on def user_api_key_auth
* add async_function_with_retries
* remove dead code
* add tracer.wrap on base aws llm
* add tracer.wrap on base aws llm
* fix print verbose
* fix dd tracing
* trace base aws llm
* fix test base aws llm
* fix converse transform
* test base aws llm
* BASE_AWS_LLM_PATH
* BASE_AWS_LLM_PATH
* test dd tracing
2025-02-20 18:00:41 -08:00
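The bullets above add `@tracer.wrap()` around hot paths like `user_api_key_auth` and the base AWS LLM. As a stand-in for ddtrace (not its real API), the decorator pattern can be sketched in pure Python:

```python
import functools
import time

def trace(span_name: str):
    """Sketch of the '@tracer.wrap()' pattern: record a span name and
    duration around a function call. Illustrative only -- ddtrace's real
    tracer manages spans, context propagation, and export to the agent."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                return fn(*args, **kwargs)
            finally:
                # Stash the last span on the wrapper for inspection.
                wrapper.last_span = (span_name, time.monotonic() - start)
        return wrapper
    return decorator
```

Wrapping auth and provider-call entry points like this is what yields the "more detailed" per-request traces the PR title describes.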
Ishaan Jaff
bb6f43d12e
(Bug fix) - Cache Health not working when configured with prometheus service logger (#8687)
...
* fix serialize on safe json dumps
* test_non_standard_dict_keys_complex
* ui fix HealthCheckCacheParams
* fix HealthCheckCacheParams
* fix code qa
* test_cache_ping_failure
* test_cache_ping_health_check_includes_only_cache_attributes
* test_cache_ping_health_check_includes_only_cache_attributes
2025-02-20 13:41:56 -08:00
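The `test_non_standard_dict_keys_complex` bullet points at the underlying serialization problem: JSON requires string keys, so health-check payloads containing dicts keyed by tuples or ints fail a plain `json.dumps`. A sketch of the key-coercion step (helper name is illustrative):

```python
import json

def serialize_keys(obj):
    """Sketch: recursively convert non-string dict keys (tuples, ints, ...)
    to str(key) so the structure becomes JSON-serializable."""
    if isinstance(obj, dict):
        return {str(k): serialize_keys(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [serialize_keys(v) for v in obj]
    return obj
```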
Ishaan Jaff
fff15543d9
(UI + Proxy) Cache Health Check Page - Cleanup/Improvements (#8665)
...
* fixes for redis cache ping serialization
* fix cache ping check
* fix cache health check ui
* working error details on ui
* ui expand / collapse error
* move cache health check to diff file
* fix displaying error from cache health check
* ui allow copying errors
* ui cache health fixes
* show redis details
* clean up cache health page
* ui polish fixes
* fix error handling on cache health page
* fix redis_cache_params on cache ping response
* error handling
* cache health ping response
* fix error response from cache ping
* parsedLitellmParams
* fix cache health check
* fix cache health page
* cache safely handle json dumps issues
* test caching routes
* test_primitive_types
* fix caching routes
* litellm_mapped_tests
* fix pytest-mock
* fix _serialize
* fix linting on safe dumps
* test_default_max_depth
* pip install "pytest-mock==3.12.0"
* litellm_mapped_tests_coverage
* add readme on new litellm test dir
2025-02-19 19:08:50 -08:00
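Several bullets above (`cache safely handle json dumps issues`, `fix _serialize`, `test_default_max_depth`) describe a depth-limited, failure-tolerant JSON dump for cache-ping responses. A sketch of such a `safe_dumps` (names and the depth-cap sentinel are assumptions):

```python
import json

def safe_dumps(obj, max_depth: int = 10) -> str:
    """Sketch of a 'safe json dumps': cap recursion depth and fall back to
    str() for values json cannot serialize, so a cache health response
    containing exotic objects (e.g. a live redis client) never raises."""
    def sanitize(o, depth):
        if depth > max_depth:
            return "max_depth_reached"  # sentinel value assumed
        if isinstance(o, dict):
            return {str(k): sanitize(v, depth + 1) for k, v in o.items()}
        if isinstance(o, (list, tuple)):
            return [sanitize(v, depth + 1) for v in o]
        if isinstance(o, (str, int, float, bool)) or o is None:
            return o
        return str(o)  # last-resort stringification for unknown types
    return json.dumps(sanitize(obj, 0))
```

The depth cap guards against cyclic or pathologically nested structures, while the `str()` fallback is what lets the UI display partial details instead of a serialization traceback.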