Commit graph

11876 commits

Author SHA1 Message Date
Ishaan Jaff
5d17c814a3 router - use retry policy 2024-05-04 17:04:51 -07:00
Ishaan Jaff
6d1981fbaa init router retry policy 2024-05-04 16:59:14 -07:00
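The two retry-policy commits above wire per-exception-type retry counts into the litellm Router. The sketch below shows the general shape of that configuration; the import path and RetryPolicy field names are assumptions for illustration, not confirmed against these exact commits.

```python
# Hedged sketch: give the Router a retry policy with different retry counts
# per exception type. Field names below are assumed for illustration.
from litellm import Router
from litellm.router import RetryPolicy  # assumed import path

retry_policy = RetryPolicy(
    TimeoutErrorRetries=3,                 # retry transient timeouts
    RateLimitErrorRetries=5,               # retry 429s more aggressively
    ContentPolicyViolationErrorRetries=0,  # never retry content-policy blocks
)

router = Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {"model": "azure/gpt-35-turbo", "api_key": "sk-..."},
        }
    ],
    retry_policy=retry_policy,  # assumed parameter added by the commits above
)
```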
Krish Dholakia
601e8a1172
Merge pull request #3448 from BerriAI/litellm_anthropic_fix
fix(factory.py): support 'function' openai message role for anthropic
2024-05-04 16:36:05 -07:00
Ishaan Jaff
85b2137f9c fix - test exceptions vertex ai 2024-05-04 16:09:20 -07:00
Andras Toth
d99555075f
fix(integrations): OpenMeter sync logger 2024-05-04 22:47:20 +02:00
Ishaan Jaff
7150df653f test azure exceptions are more descriptive 2024-05-04 13:02:29 -07:00
Ishaan Jaff
d45328dda6
Merge pull request #3450 from BerriAI/revert-3397-main
Revert "Add return_exceptions to litellm.batch_completion"
2024-05-04 13:01:25 -07:00
Ishaan Jaff
df8e33739d
Revert "Add return_exceptions to litellm.batch_completion" 2024-05-04 13:01:17 -07:00
Ishaan Jaff
d968dedd77
Merge pull request #1530 from TanaroSch/main
change max_tokens type to int
2024-05-04 12:47:15 -07:00
Krish Dholakia
a3905bad94
Merge pull request #3447 from BerriAI/litellm_fix_redis_ping
fix(caching.py): fix redis caching ping check
2024-05-04 12:46:31 -07:00
Krrish Dholakia
09d7121af2 fix(bedrock.py): map finish reason for bedrock 2024-05-04 12:45:40 -07:00
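The bedrock.py fix above normalizes the provider's stop reason to an OpenAI-style finish_reason. A rough illustration of the idea follows; the concrete strings litellm maps are assumptions, not taken from the commit.

```python
# Illustration only: translate Bedrock/Anthropic-style stop reasons into the
# OpenAI finish_reason vocabulary. The actual table in bedrock.py may differ.
BEDROCK_TO_OPENAI_FINISH_REASON = {
    "end_turn": "stop",
    "stop_sequence": "stop",
    "max_tokens": "length",
}

def map_finish_reason(bedrock_stop_reason: str) -> str:
    # Fall back to "stop" for anything unrecognized.
    return BEDROCK_TO_OPENAI_FINISH_REASON.get(bedrock_stop_reason, "stop")
```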
Ishaan Jaff
855c7caf0b fix add get_first_chars_messages in utils 2024-05-04 12:43:09 -07:00
Ishaan Jaff
7094ac9557
Merge pull request #3397 from ffreemt/main
Add return_exceptions to litellm.batch_completion
2024-05-04 12:41:21 -07:00
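PR #3397 above added a return_exceptions flag to litellm.batch_completion (later reverted in #3450, see the entries further up). A hedged sketch of how such a flag is typically used, mirroring asyncio.gather; the exact behavior at this commit is an assumption based on the PR title.

```python
# Sketch, not the confirmed litellm API at this commit: with
# return_exceptions=True, failed requests come back as exception objects in
# the results list instead of aborting the whole batch.
import litellm

responses = litellm.batch_completion(
    model="gpt-3.5-turbo",
    messages=[
        [{"role": "user", "content": "hello"}],
        [{"role": "user", "content": "this request might fail"}],
    ],
    return_exceptions=True,  # assumed flag from PR #3397; removed by the revert in #3450
)

for r in responses:
    if isinstance(r, Exception):
        print("request failed:", r)
    else:
        print(r.choices[0].message.content)
```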
Krrish Dholakia
8d49b3a84c fix(factory.py): support openai 'functions' messages 2024-05-04 12:33:39 -07:00
Ishaan Jaff
faa8b77325
Merge pull request #3449 from BerriAI/litellm_return_messages_exceptions
[Feat] Add Exception mapping for Azure ContentPolicyViolationError
2024-05-04 11:55:19 -07:00
Ishaan Jaff
76825e1d2c test - mapping content policy violation errors 2024-05-04 11:15:34 -07:00
Ishaan Jaff
c6beaf53af litellm map Azure GPT ContentPolicyViolationError 2024-05-04 11:14:47 -07:00
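The two commits above map Azure's content-filter rejections onto litellm's ContentPolicyViolationError, so callers can catch a single exception type across providers. A minimal sketch, assuming the class is exposed as litellm.ContentPolicyViolationError:

```python
# Minimal sketch: catch the mapped exception instead of parsing Azure's raw
# error body. Assumes litellm.ContentPolicyViolationError is the mapped class.
import litellm

try:
    litellm.completion(
        model="azure/gpt-35-turbo",
        messages=[{"role": "user", "content": "text that trips the content filter"}],
    )
except litellm.ContentPolicyViolationError as e:
    print("blocked by the provider's content policy:", e)
```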
Ishaan Jaff
59dac1bc7a ui - new build 2024-05-04 10:55:39 -07:00
Krrish Dholakia
d9d5149aa1 fix(factory.py): support mapping openai 'tool' message to anthropic format 2024-05-04 10:14:52 -07:00
Krrish Dholakia
33472bfd2b fix(factory.py): support 'function' openai message role for anthropic
Fixes https://github.com/BerriAI/litellm/issues/3446
2024-05-04 10:03:30 -07:00
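The factory.py commits above (and the merge of #3448 further up) teach the anthropic prompt builder to accept OpenAI 'function' and 'tool' role messages. A sketch of the kind of conversation that previously failed against claude models (issue #3446) and should now be translated; the model name and tool payload here are placeholders.

```python
# Sketch: an OpenAI-style conversation containing "function_call" / "function"
# role messages, which factory.py now converts to Anthropic's format.
import litellm  # assumes ANTHROPIC_API_KEY is set in the environment

messages = [
    {"role": "user", "content": "What's the weather in Boston?"},
    {
        "role": "assistant",
        "content": None,
        "function_call": {"name": "get_weather", "arguments": '{"city": "Boston"}'},
    },
    {"role": "function", "name": "get_weather", "content": '{"temp_f": 58}'},
]

response = litellm.completion(model="claude-3-sonnet-20240229", messages=messages)
```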
Ishaan Jaff
0bc6969e4d
Merge pull request #3408 from msabramo/test_proxy_exception_mapping_mock_improvements
Improve mocking in `test_proxy_exception_mapping.py`
2024-05-04 09:29:41 -07:00
Krrish Dholakia
5a79f648c6 fix(caching.py): fix redis caching ping check
don't fail to startup. Log an error message.
2024-05-04 08:48:53 -07:00
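The caching.py fix above changes startup behavior: a failed Redis ping is logged instead of raised. Roughly this pattern, shown here as a generic sketch rather than the actual litellm code:

```python
# Generic sketch of the behavior described in the commit: don't abort startup
# when the Redis ping fails; log an error and keep going.
import logging
import redis

logger = logging.getLogger(__name__)

def init_redis_cache(host, port, password=None):
    client = redis.Redis(host=host, port=port, password=password)
    try:
        client.ping()
    except Exception as e:
        # Before the fix, this exception propagated and killed startup.
        logger.error("Redis ping failed, continuing without a verified cache: %s", e)
    return client
```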
Krish Dholakia
e7149a6812
Merge pull request #3445 from paneru-rajan/Issue-1159-imrove-reaceloop-doc
Improve the Traceloop documentation
2024-05-04 08:35:56 -07:00
alisalim17
416d459a77 enable exception logging in logfire_openai.error() by setting _exc_info to True 2024-05-04 19:33:38 +04:00
alisalim17
7e0b479a37 docs: add documentation for logfire integration 2024-05-04 17:47:54 +04:00
alisalim17
978912ef32 feat: add failure handler for logfire 2024-05-04 17:40:23 +04:00
Rajan Paneru
7d9377f18a Improve the Traceloop documentation
Following the documented steps exactly, I hit two errors, which this pull request fixes:
* Fixed the spacing issue in the yaml snippet: it threw an error when the code was copy-pasted as-is
* Added api_key: my-fake-key, since it is required to run litellm --config config.yaml --debug; without it the command fails

Need for improvements:
* "Traceloop" is ambiguous: it is the company that maintains OpenLLMetry, it offers an observability solution, and its SDK is also named traceloop-sdk
* The doc was missing several other observability solutions
* The steps were not quite obvious; added one more step to make things clear
2024-05-04 22:42:53 +09:30
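For context on the config fixes described above, here is the shape of the proxy configuration the doc walks through, written as the Python dict the config.yaml corresponds to; the model entries are placeholders and only the api_key: my-fake-key detail comes from the commit itself.

```python
# Sketch of the proxy config the Traceloop doc describes, as a Python dict
# (the actual doc uses config.yaml). Only the placeholder api_key is taken
# from the commit; everything else is illustrative.
config = {
    "model_list": [
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {
                "model": "gpt-3.5-turbo",
                "api_key": "my-fake-key",  # required to run `litellm --config config.yaml --debug`
            },
        }
    ],
    "litellm_settings": {
        "success_callback": ["traceloop"],  # assumed callback name for the Traceloop integration
    },
}
```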
alisalim17
134611a8f4 test: add tests for logfire 2024-05-04 16:23:06 +04:00
alisalim17
39099e9c5b feat: add logfire integration 2024-05-04 16:22:53 +04:00
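The logfire commits above add it as a logging integration. A minimal usage sketch, assuming it is wired up like litellm's other callbacks via the success/failure callback lists and a LOGFIRE_TOKEN environment variable:

```python
# Assumed wiring, mirroring litellm's other logging integrations: register
# "logfire" as a callback and authenticate via an environment variable.
import os
import litellm

os.environ["LOGFIRE_TOKEN"] = "your-logfire-write-token"  # assumed env var name

litellm.success_callback = ["logfire"]
litellm.failure_callback = ["logfire"]  # exercises the failure handler added in 978912ef32

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
)
```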
Lunik
1639a51f24
🔊 fix: Correctly use verbose logging
Signed-off-by: Lunik <lunik@tiwabbit.fr>
2024-05-04 11:04:23 +02:00
Lunik
8783fd4895
feat: Use 8 severity levels for azure content safety
Signed-off-by: Lunik <lunik@tiwabbit.fr>
2024-05-04 10:45:39 +02:00
Lunik
ebbeb333c6
✏️ doc: typo in azure content safety
Signed-off-by: Lunik <lunik@tiwabbit.fr>
2024-05-04 10:45:15 +02:00
Lunik
cb178723ca
📝 doc: Azure content safety Proxy usage
Signed-off-by: Lunik <lunik@tiwabbit.fr>
2024-05-04 10:39:43 +02:00
Ishaan Jaff
8e9ba0f27f
Merge pull request #3441 from BerriAI/ui_select_start_end_time
UI select start end time
2024-05-03 22:14:58 -07:00
Ishaan Jaff
2cfb37aaaf add text about model group / time range 2024-05-03 22:14:12 -07:00
Ishaan Jaff
1d9a96bf8f ui - set models and start/endtimes 2024-05-03 22:10:40 -07:00
Ishaan Jaff
19aa43798c ui - select model group 2024-05-03 22:09:22 -07:00
Ishaan Jaff
e7adbb3801 ui - model dashboard 2024-05-03 22:09:01 -07:00
Ishaan Jaff
d4afe8a3bc ui - view logs by model group and time 2024-05-03 21:59:22 -07:00
Krrish Dholakia
b7ca9a53c9 refactor(main.py): trigger new build 2024-05-03 21:53:51 -07:00
Ishaan Jaff
611e7bd403 ui - select model group and date time range 2024-05-03 21:51:47 -07:00
Krrish Dholakia
fa47ce456d bump: version 1.35.37 → 1.35.38 2024-05-03 21:38:32 -07:00
Krrish Dholakia
8249c986bf fix(main.py): support new 'supports_system_message=False' param
Fixes https://github.com/BerriAI/litellm/issues/3325
2024-05-03 21:31:45 -07:00
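The supports_system_message=False commit addresses models that reject a system role (issue #3325). A hedged sketch of the intent follows; whether the flag is passed straight to completion() as shown, rather than set in model config, is an assumption, and the model name is a placeholder.

```python
# Sketch of the intent behind supports_system_message=False: for models that
# reject a system role, litellm reworks the system prompt instead of sending it
# as-is. Passing the flag directly to completion() is assumed for illustration.
import litellm

response = litellm.completion(
    model="ollama/model-without-system-role",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a terse assistant."},
        {"role": "user", "content": "Summarize quicksort in one sentence."},
    ],
    supports_system_message=False,
)
```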
Ishaan Jaff
ed714a9d6e ui - select model and start/end time 2024-05-03 21:21:41 -07:00
Ishaan Jaff
d806fe3513 startTime and endTime on UI 2024-05-03 21:21:28 -07:00
Ishaan Jaff
fccdb92c6b fix - select startTime and endTime on UI 2024-05-03 21:20:19 -07:00
Ishaan Jaff
f7150cdba2 ui - select start and end date 2024-05-03 20:52:30 -07:00
Krrish Dholakia
a35ac050d2 test(test_router_fallbacks.py): bump test limits 2024-05-03 20:42:29 -07:00
Ishaan Jaff
01a11ccced ui - new build 2024-05-03 20:40:58 -07:00
Sebastián Estévez
fc0ced48c1
add_function_to_prompt bug fix
This blows up when there is no "functions" key in the dictionary, even when "tools" is present, because the inner call executes regardless (it does not short-circuit).
2024-05-03 23:38:54 -04:00
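The commit body above describes a classic eager-evaluation pitfall: in Python, a fallback expression passed as a function argument runs before the call, so it cannot act as a short circuit. A generic illustration of the failure mode and the fix, not the actual litellm code:

```python
# Generic illustration of the bug described above (not the litellm source):
# the fallback argument is evaluated eagerly, so it blows up even though
# "tools" is present and only "functions" is missing.
optional_params = {"tools": [{"type": "function", "function": {"name": "get_weather"}}]}

try:
    # Buggy pattern: optional_params["functions"] raises KeyError because the
    # default argument is evaluated before .get() can return anything.
    functions = optional_params.get("tools", optional_params["functions"])
except KeyError as e:
    print("blew up despite tools being present:", e)

# Short-circuiting fix: only fall back to "functions" when "tools" is missing.
functions = optional_params.get("tools") or optional_params.get("functions", [])
print("short-circuit version works:", functions)
```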