Commit graph

440 commits

Author SHA1 Message Date
Ishaan Jaff
dae5c24b55 get_chat_completion_prompt 2025-03-19 20:50:15 -07:00
Ishaan Jaff
a49bd460e0 get_custom_logger_for_prompt_management 2025-03-19 17:38:24 -07:00
Ishaan Jaff
805542005a CustomPromptManagement 2025-03-19 16:38:59 -07:00
Ishaan Jaff
730f61f476 get_custom_loggers_for_type 2025-03-19 16:36:41 -07:00
Ishaan Jaff
f5ef0c3cb7 fix code quality checks 2025-03-18 22:34:43 -07:00
Ishaan Jaff
0f2e095b6b _arize_otel_logger 2025-03-18 22:19:51 -07:00
Ishaan Jaff
57e5c94360 Merge branch 'main' into litellm_arize_dynamic_logging 2025-03-18 22:13:35 -07:00
Krrish Dholakia
a34cc2031d fix(response_metadata.py): log the litellm_model_name
make it easier to track the model sent to the provider
2025-03-18 17:46:33 -07:00
Krrish Dholakia
453729b6c9 fix(litellm_logging.py): fix update 2025-03-18 17:18:58 -07:00
Ishaan Jaff
f4b7b49e9a fix arize logging 2025-03-18 15:51:58 -07:00
Krrish Dholakia
2c094c2356 fix(get_llm_provider.py): Fixes https://github.com/BerriAI/litellm/issues/9291 2025-03-18 10:48:28 -07:00
Krish Dholakia
cd5024f3b1 Merge pull request #9333 from BerriAI/litellm_dev_03_17_2025_p2
fix(ollama/completions/transformation.py): pass prompt, untemplated o…
2025-03-17 21:48:30 -07:00
Krrish Dholakia
22faf7d232 fix(ollama/completions/transformation.py): pass prompt, untemplated on /completions request
Fixes https://github.com/BerriAI/litellm/issues/6900
2025-03-17 18:35:44 -07:00
Krrish Dholakia
c4b2e0ae3d fix(streaming_handler.py): support logging complete streaming response on cache hit 2025-03-17 18:10:39 -07:00
Krrish Dholakia
dd9e79adbd fix(streaming_handler.py): emit deep copy of completed chunk 2025-03-17 17:26:21 -07:00
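
A minimal sketch of the failure mode behind dd9e79adbd: if a streaming handler yields the same chunk object it also retains for logging, a caller that mutates the chunk corrupts the logged copy. The class and field names below are illustrative, not LiteLLM's actual streaming_handler internals:

```python
import copy

class StreamingHandler:
    """Illustrative only: keep the original chunk for logging and hand
    callers a deep copy, so downstream mutation cannot corrupt the
    accumulated/logged response."""

    def __init__(self) -> None:
        self._chunks: list[dict] = []

    def emit(self, chunk: dict) -> dict:
        self._chunks.append(chunk)     # original retained for logging
        return copy.deepcopy(chunk)    # caller gets an isolated copy
```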
Krrish Dholakia
a5b497667c fix(logging_utils.py): revert change 2025-03-16 21:04:41 -07:00
Krrish Dholakia
a99251a4ab fix(streaming_handler.py): raise stop iteration post-finish reason 2025-03-16 20:40:41 -07:00
Krrish Dholakia
bde9ae8a95 fix(litellm_logging.py): remove unused import 2025-03-16 20:24:27 -07:00
Krrish Dholakia
c0a76427d2 fix(streaming_handler.py): pass complete streaming response on completion 2025-03-16 20:22:12 -07:00
Krrish Dholakia
08b297230e fix(streaming_handler.py): return model response on finished chunk 2025-03-16 13:05:46 -07:00
Krrish Dholakia
612d5a284d refactor(litellm_logging.py): delegate returning a complete response to the streaming_handler
Removes incorrect logic for calculating complete streaming response from litellm logging
2025-03-15 09:55:33 -07:00
Krrish Dholakia
dd2c980d5b fix(utils.py): Prevents final chunk w/ usage from being ignored
Fixes https://github.com/BerriAI/litellm/issues/7112
2025-03-15 09:12:14 -07:00
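
The bug fixed in dd2c980d5b is a common one when consuming OpenAI-style streams: with stream_options={"include_usage": True}, the final chunk carries usage but an empty choices list, so loops that only read choices silently drop it. A hedged sketch using plain dicts, not LiteLLM's actual chunk types:

```python
def consume_stream(chunks: list[dict]) -> tuple[str, dict | None]:
    """Illustrative: the final chunk carries usage alongside an empty
    choices list; skipping empty-choice chunks would drop it."""
    text: list[str] = []
    usage = None
    for chunk in chunks:
        if chunk.get("usage"):              # final usage-only chunk
            usage = chunk["usage"]
            continue
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {})
            if delta.get("content"):
                text.append(delta["content"])
    return "".join(text), usage
```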
Krrish Dholakia
a9dceacc1b fix(factory.py): reduce ollama pt LOC < 50 2025-03-14 21:10:05 -07:00
Krish Dholakia
59fd58643b Merge pull request #9261 from briandevvn/fix_ollama_pt
Fix "system" role has become unacceptable in ollama
2025-03-14 20:13:28 -07:00
Krrish Dholakia
f089b1e23f feat(endpoints.py): support adding credentials by model id
Allows user to reuse existing model credentials
2025-03-14 12:32:32 -07:00
Krrish Dholakia
605a4d1121 feat(endpoints.py): enable retrieving existing credentials by model name
Enables reusing existing credentials
2025-03-14 12:02:50 -07:00
Brian Dev
12db28b0af Support 'system' role ollama 2025-03-15 00:55:18 +07:00
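
For context on 12db28b0af and PR #9261: a prompt-template builder that only accepts user/assistant roles will raise on "system". A sketch of folding system messages into the rendered prompt; the template shape here is an assumption, not Ollama's or LiteLLM's exact format:

```python
def ollama_pt(messages: list[dict]) -> str:
    """Illustrative template: render 'system' messages into the prompt
    instead of raising on an unexpected role."""
    tags = {"system": "### System:", "user": "### User:", "assistant": "### Assistant:"}
    rendered = []
    for message in messages:
        tag = tags.get(message["role"])
        if tag is None:
            raise ValueError(f"unsupported role: {message['role']}")
        rendered.append(f"{tag}\n{message['content']}")
    return "\n\n".join(rendered) + "\n\n### Response:\n"
```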
Ishaan Jaff
276a7089df Merge pull request #9220 from BerriAI/litellm_qa_responses_api
[Fixes] Responses API - allow /responses and subpaths as LLM API route + Add exception mapping for responses API
2025-03-13 21:36:59 -07:00
Ishaan Jaff
7827c275ba exception_type 2025-03-13 20:09:32 -07:00
Sunny Wan
f9a5109203 Merge branch 'BerriAI:main' into main 2025-03-13 19:37:22 -04:00
Ishaan Jaff
15d618f5b1 Add exception mapping for responses API 2025-03-13 15:57:58 -07:00
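
"Exception mapping" (15d618f5b1) generally means translating raw provider HTTP failures into the library's typed exceptions so callers can handle the Responses API like any other endpoint. A hypothetical sketch; the class names echo common LiteLLM exception types, but the table itself is illustrative:

```python
class BadRequestError(Exception): ...
class AuthenticationError(Exception): ...
class RateLimitError(Exception): ...

# Illustrative table: real mappings also inspect the provider error body.
STATUS_TO_EXCEPTION = {
    400: BadRequestError,
    401: AuthenticationError,
    429: RateLimitError,
}

def map_responses_api_exception(status_code: int, message: str) -> Exception:
    """Translate a raw HTTP failure into a typed exception."""
    return STATUS_TO_EXCEPTION.get(status_code, Exception)(message)
```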
Ishaan Jaff
1ee6b7852f fix exception_type 2025-03-13 15:33:17 -07:00
Krish Dholakia
cff1c1f7d8 Merge branch 'main' into litellm_dev_03_12_2025_p1 2025-03-12 22:14:02 -07:00
Krrish Dholakia
52926408cd feat(credential_accessor.py): fix upserting new credentials via accessor 2025-03-12 19:03:37 -07:00
Krrish Dholakia
738c0b873d fix(azure_ai/transformation.py): support passing api version to azure ai services endpoint
Fixes https://github.com/BerriAI/litellm/issues/7275
2025-03-12 15:16:42 -07:00
Krish Dholakia
2d957a0ed9 Merge branch 'main' into litellm_dev_03_10_2025_p3 2025-03-12 14:56:01 -07:00
Ishaan Jaff
c2dbcb798f working streaming logging + cost tracking 2025-03-12 07:27:53 -07:00
Ishaan Jaff
46bc76d3e6 _get_assembled_streaming_response 2025-03-12 07:21:03 -07:00
Ishaan Jaff
122c11d346 revert to older logging implementation 2025-03-12 07:14:36 -07:00
Ishaan Jaff
fde75a068a working streaming logging 2025-03-12 00:02:39 -07:00
Ishaan Jaff
51dc24a405 _transform_response_api_usage_to_chat_usage 2025-03-11 22:26:44 -07:00
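
The helper named in 51dc24a405 implies a field-name mapping: the Responses API reports input_tokens/output_tokens, while Chat Completions usage uses prompt_tokens/completion_tokens/total_tokens. A sketch of that mapping; the real helper's signature is not shown in this log:

```python
def transform_responses_usage_to_chat_usage(usage: dict) -> dict:
    """Map Responses API token fields onto Chat Completions names."""
    prompt_tokens = usage.get("input_tokens", 0)
    completion_tokens = usage.get("output_tokens", 0)
    return {
        "prompt_tokens": prompt_tokens,
        "completion_tokens": completion_tokens,
        "total_tokens": usage.get("total_tokens", prompt_tokens + completion_tokens),
    }
```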
Ishaan Jaff
24cb83b0e4 Response API cost tracking 2025-03-11 22:02:14 -07:00
Krrish Dholakia
9af73f339a test: fix tests 2025-03-11 17:42:36 -07:00
Krrish Dholakia
152bc67d22 refactor(azure.py): working azure client init on audio speech endpoint 2025-03-11 14:19:45 -07:00
Krrish Dholakia
92881ee79e fix: fix linting error 2025-03-10 21:22:00 -07:00
Krrish Dholakia
f56c5ca380 feat: working e2e credential management - support reusing existing credentials 2025-03-10 19:29:24 -07:00
Krrish Dholakia
fdd5ba3084 feat(credential_accessor.py): support loading in credentials from credential_list
Resolves https://github.com/BerriAI/litellm/issues/9114
2025-03-10 17:15:58 -07:00
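
A sketch of what fdd5ba3084's credential_list loading plausibly looks like: named credential blocks loaded once, then referenced by model configs instead of repeating secrets. The field names credential_name/credential_values are assumptions, not confirmed by this log:

```python
class CredentialAccessor:
    """Illustrative store: credentials registered by name so model
    configs can reference them rather than duplicate API keys."""

    _store: dict[str, dict] = {}

    @classmethod
    def load(cls, credential_list: list[dict]) -> None:
        for cred in credential_list:
            # assumed field names, see lead-in
            cls._store[cred["credential_name"]] = cred["credential_values"]

    @classmethod
    def get(cls, name: str) -> dict:
        return cls._store[name]
```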
Krrish Dholakia
bfbe26b91d feat(azure.py): add azure bad request error support 2025-03-10 15:59:06 -07:00
Krrish Dholakia
5f87dc229a feat(openai.py): bubble all error information back to client 2025-03-10 15:27:43 -07:00
Krish Dholakia
f899b828cf Support openrouter reasoning_content on streaming (#9094)
* feat(convert_dict_to_response.py): support openrouter format of reasoning content

* fix(transformation.py): fix openrouter streaming with reasoning content

Fixes https://github.com/BerriAI/litellm/issues/8193#issuecomment-270892962

* fix: fix type error
2025-03-09 20:03:59 -07:00
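
On #9094: OpenRouter streams reasoning text in a delta field separate from the answer content, so the two must be accumulated independently. A sketch under the assumption that the field is named "reasoning"; the exact field name may differ:

```python
def accumulate(chunks: list[dict]) -> tuple[str, str]:
    """Illustrative: collect answer text and reasoning text separately
    from OpenRouter-style streaming deltas."""
    content: list[str] = []
    reasoning: list[str] = []
    for chunk in chunks:
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {})
            if delta.get("content"):
                content.append(delta["content"])
            if delta.get("reasoning"):          # assumed field name
                reasoning.append(delta["reasoning"])
    return "".join(content), "".join(reasoning)
```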