Author | Commit | Message | Date
Ishaan Jaff | 66b9a4a686 | get_chat_completion_prompt | 2025-03-19 20:50:15 -07:00
Ishaan Jaff | 4db3ca70e4 | get_custom_logger_for_prompt_management | 2025-03-19 17:38:24 -07:00
Ishaan Jaff | 40e1ccf339 | CustomPromptManagement | 2025-03-19 16:38:59 -07:00
Ishaan Jaff | 98c5ffdb20 | get_custom_loggers_for_type | 2025-03-19 16:36:41 -07:00
Ishaan Jaff | afd7d066bf | fix code quality checks | 2025-03-18 22:34:43 -07:00
Ishaan Jaff | a27782e657 | _arize_otel_logger | 2025-03-18 22:19:51 -07:00
Ishaan Jaff | 43eedf26f7 | Merge branch 'main' into litellm_arize_dynamic_logging | 2025-03-18 22:13:35 -07:00
Krrish Dholakia | c8eaf8e419 | fix(response_metadata.py): log the litellm_model_name (make it easier to track the model sent to the provider) | 2025-03-18 17:46:33 -07:00
Krrish Dholakia | d29108bfd5 | fix(litellm_logging.py): fix update | 2025-03-18 17:18:58 -07:00
Ishaan Jaff | 063db7ba91 | fix arize logging | 2025-03-18 15:51:58 -07:00
Krrish Dholakia | f0da181215 | fix(get_llm_provider.py): Fixes https://github.com/BerriAI/litellm/issues/9291 | 2025-03-18 10:48:28 -07:00
Krish Dholakia | 8ee4eead7f | Merge pull request #9333 from BerriAI/litellm_dev_03_17_2025_p2 (fix(ollama/completions/transformation.py): pass prompt, untemplated on /completions request) | 2025-03-17 21:48:30 -07:00
Krrish Dholakia | 4d56992407 | fix(ollama/completions/transformation.py): pass prompt, untemplated on /completions request (Fixes https://github.com/BerriAI/litellm/issues/6900) | 2025-03-17 18:35:44 -07:00
Krrish Dholakia | 301375bf84 | fix(streaming_handler.py): support logging complete streaming response on cache hit | 2025-03-17 18:10:39 -07:00
Krrish Dholakia | ba6369e359 | fix(streaming_handler.py): emit deep copy of completed chunk | 2025-03-17 17:26:21 -07:00
Krrish Dholakia | 01fc7f4175 | fix(logging_utils.py): revert change | 2025-03-16 21:04:41 -07:00
Krrish Dholakia | 85cf702deb | fix(streaming_handler.py): raise stop iteration post-finish reason | 2025-03-16 20:40:41 -07:00
Krrish Dholakia | 08e73d66a1 | fix(litellm_logging.py): remove unused import | 2025-03-16 20:24:27 -07:00
Krrish Dholakia | 7446038d26 | fix(streaming_handler.py): pass complete streaming response on completion | 2025-03-16 20:22:12 -07:00
Krrish Dholakia | 4d3f4b31d1 | fix(streaming_handler.py): return model response on finished chunk | 2025-03-16 13:05:46 -07:00
Krrish Dholakia | 82320a9b65 | refactor(litellm_logging.py): delegate returning a complete response to the streaming_handler (removes incorrect logic for calculating complete streaming response from litellm logging) | 2025-03-15 09:55:33 -07:00
Krrish Dholakia | 424f51cc06 | fix(utils.py): Prevents final chunk w/ usage from being ignored (Fixes https://github.com/BerriAI/litellm/issues/7112) | 2025-03-15 09:12:14 -07:00
Krrish Dholakia | d818530265 | fix(factory.py): reduce ollama pt LOC < 50 | 2025-03-14 21:10:05 -07:00
Krish Dholakia | 7b189e3085 | Merge pull request #9261 from briandevvn/fix_ollama_pt (Fix "system" role has become unacceptable in ollama) | 2025-03-14 20:13:28 -07:00
Krrish Dholakia | b75cd3b887 | feat(endpoints.py): support adding credentials by model id (Allows user to reuse existing model credentials) | 2025-03-14 12:32:32 -07:00
Krrish Dholakia | 913dc5b73b | feat(endpoints.py): enable retrieving existing credentials by model name (Enables reusing existing credentials) | 2025-03-14 12:02:50 -07:00
Brian Dev | f53e365170 | Support 'system' role ollama | 2025-03-15 00:55:18 +07:00
Ishaan Jaff | ceb8668e4a | Merge pull request #9220 from BerriAI/litellm_qa_responses_api ([Fixes] Responses API - allow /responses and subpaths as LLM API route + Add exception mapping for responses API) | 2025-03-13 21:36:59 -07:00
Ishaan Jaff | a6e04aeffb | exception_type | 2025-03-13 20:09:32 -07:00
Sunny Wan | e01d12b878 | Merge branch 'BerriAI:main' into main | 2025-03-13 19:37:22 -04:00
Ishaan Jaff | c2ed7add37 | Add exception mapping for responses API | 2025-03-13 15:57:58 -07:00
Ishaan Jaff | acdc2d8266 | fix exception_type | 2025-03-13 15:33:17 -07:00
Krish Dholakia | 72f92853e0 | Merge branch 'main' into litellm_dev_03_12_2025_p1 | 2025-03-12 22:14:02 -07:00
Krrish Dholakia | d024a5d703 | feat(credential_accessor.py): fix upserting new credentials via accessor | 2025-03-12 19:03:37 -07:00
Krrish Dholakia | c76cf6ad6c | fix(azure_ai/transformation.py): support passing api version to azure ai services endpoint (Fixes https://github.com/BerriAI/litellm/issues/7275) | 2025-03-12 15:16:42 -07:00
Krish Dholakia | 103b3cb574 | Merge branch 'main' into litellm_dev_03_10_2025_p3 | 2025-03-12 14:56:01 -07:00
Ishaan Jaff | 2a68c56bc0 | working streaming logging + cost tracking | 2025-03-12 07:27:53 -07:00
Ishaan Jaff | c1b9c4cc7b | _get_assembled_streaming_response | 2025-03-12 07:21:03 -07:00
Ishaan Jaff | a3d1e39164 | revert to older logging implementation | 2025-03-12 07:14:36 -07:00
Ishaan Jaff | 0dc5e784f5 | working streaming logging | 2025-03-12 00:02:39 -07:00
Ishaan Jaff | fddc1d4186 | _transform_response_api_usage_to_chat_usage | 2025-03-11 22:26:44 -07:00
Ishaan Jaff | d6ea064ebe | Response API cost tracking | 2025-03-11 22:02:14 -07:00
Krrish Dholakia | 934c06c207 | test: fix tests | 2025-03-11 17:42:36 -07:00
Krrish Dholakia | 4f4507ccc0 | refactor(azure.py): working azure client init on audio speech endpoint | 2025-03-11 14:19:45 -07:00
Krrish Dholakia | e58c18611f | fix: fix linting error | 2025-03-10 21:22:00 -07:00
Krrish Dholakia | a87f822c50 | feat: working e2e credential management - support reusing existing credentials | 2025-03-10 19:29:24 -07:00
Krrish Dholakia | e518e3558b | feat(credential_accessor.py): support loading in credentials from credential_list (Resolves https://github.com/BerriAI/litellm/issues/9114) | 2025-03-10 17:15:58 -07:00
Krrish Dholakia | 5d1b5f94c6 | feat(azure.py): add azure bad request error support | 2025-03-10 15:59:06 -07:00
Krrish Dholakia | cc0606b38d | feat(openai.py): bubble all error information back to client | 2025-03-10 15:27:43 -07:00
Krish Dholakia | b401f2c06f | Support openrouter reasoning_content on streaming (#9094): feat(convert_dict_to_response.py): support openrouter format of reasoning content; fix(transformation.py): fix openrouter streaming with reasoning content, fixes https://github.com/BerriAI/litellm/issues/8193#issuecomment-270892962; fix: fix type error | 2025-03-09 20:03:59 -07:00