Commit graph

829 commits

Author | SHA1 | Message | Date
Krrish Dholakia | d57e57234e | fix(openai.py): fix linting issue | 2024-01-23 13:47:37 -08:00
Krrish Dholakia | e4fda7c840 | feat(utils.py): emit response cost as part of logs | 2024-01-23 13:47:37 -08:00
Krrish Dholakia | 3234f19ad4 | fix(utils.py): move from pkg_resources to importlib | 2024-01-23 13:47:37 -08:00
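The pkg_resources-to-importlib move above follows a common migration pattern: `importlib.metadata` ships with Python 3.8+ and avoids the slow setuptools import that `pkg_resources` pulls in. A minimal sketch of the pattern (illustrative, not LiteLLM's actual diff; `get_package_version` is a hypothetical helper name):

```python
# Before (deprecated, requires setuptools at runtime):
#   import pkg_resources
#   ver = pkg_resources.get_distribution("litellm").version

# After (stdlib since Python 3.8):
from importlib.metadata import version, PackageNotFoundError

def get_package_version(name: str):
    """Return the installed version of `name`, or None if it isn't installed."""
    try:
        return version(name)
    except PackageNotFoundError:
        return None
```

`version()` raises `PackageNotFoundError` rather than returning a sentinel, so callers that previously caught `pkg_resources.DistributionNotFound` need the equivalent handler.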
Krrish Dholakia | 36c6d3cd90 | fix(utils.py): fix debug log | 2024-01-22 15:15:34 -08:00
Krrish Dholakia | a343c4d22f | refactor(utils.py): fix linting errors | 2024-01-22 15:15:34 -08:00
Krrish Dholakia | 074ea17325 | fix: support streaming custom cost completion tracking | 2024-01-22 15:15:34 -08:00
Krrish Dholakia | 2ce4258cc0 | fix(main.py): support custom pricing for embedding calls | 2024-01-22 15:15:34 -08:00
Krrish Dholakia | 276a685a59 | feat(utils.py): support custom cost tracking per second (https://github.com/BerriAI/litellm/issues/1374) | 2024-01-22 15:15:34 -08:00
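Per-second cost tracking (276a685a59) bills a call by wall-clock duration instead of token counts, which suits speech or hosted-endpoint pricing. A hypothetical sketch of the arithmetic — the function and parameter names here are illustrative, not LiteLLM's exact config keys:

```python
def completion_cost_per_second(duration_s: float, cost_per_second: float) -> float:
    """Bill a call by elapsed wall-clock seconds rather than token usage."""
    if duration_s < 0 or cost_per_second < 0:
        raise ValueError("duration and rate must be non-negative")
    return duration_s * cost_per_second

# e.g. a 12.5 s call at $0.001/s costs $0.0125
cost = completion_cost_per_second(12.5, 0.001)
```

For streaming responses the duration spans first request to final chunk, which is why the companion commit (074ea17325) wires the same tracking into the streaming path.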
Krrish Dholakia | 6c39b2855f | fix(utils.py): fix async/sync streaming logging | 2024-01-22 13:54:51 -08:00
Krrish Dholakia | 2165dcf6fb | fix(utils.py): fix callback logging | 2024-01-21 00:56:30 -08:00
Krrish Dholakia | e2831e9c80 | fix: fix proxy logging | 2024-01-20 18:22:45 -08:00
Krrish Dholakia | 09b7235b31 | fix: support info level logging on pkg + proxy | 2024-01-20 17:45:47 -08:00
Krrish Dholakia | b07677c6be | fix(gemini.py): support streaming | 2024-01-19 20:21:34 -08:00
Krrish Dholakia | f2a8ceddc2 | fix(utils.py): revert exception mapping change | 2024-01-19 17:39:35 -08:00
Krrish Dholakia | f05aba1f85 | fix(utils.py): add metadata to logging obj on setup, if exists | 2024-01-19 17:29:47 -08:00
ishaan-jaff | 6a695477ba | (fix) async langfuse logger | 2024-01-19 10:44:51 -08:00
ishaan-jaff | f2cfb76920 | (fix) use asyncio run_in_executor | 2024-01-19 09:52:51 -08:00
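The `run_in_executor` fix (f2cfb76920) reflects the standard way to keep a blocking logger call from stalling the event loop: hand it to a thread pool and await the result. A minimal sketch of the technique, assuming a synchronous logging call like the Langfuse one mentioned above (`blocking_log` is a stand-in, not LiteLLM's function):

```python
import asyncio
import time

def blocking_log(event: str) -> str:
    """Stands in for a synchronous network call made by a logging integration."""
    time.sleep(0.01)
    return f"logged: {event}"

async def alog(event: str) -> str:
    loop = asyncio.get_running_loop()
    # None selects the loop's default ThreadPoolExecutor; the coroutine
    # yields control while the blocking call runs in a worker thread.
    return await loop.run_in_executor(None, blocking_log, event)

result = asyncio.run(alog("success_event"))
```

Note that `run_in_executor` takes positional arguments only; keyword arguments require wrapping with `functools.partial`.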
ishaan-jaff | a9c5b02303 | (v0) fix | 2024-01-19 08:51:14 -08:00
ishaan-jaff | 697c511e76 | (feat) support user param for all providers | 2024-01-18 17:45:59 -08:00
ishaan-jaff | debef7544d | (feat) return Azure enhancements used | 2024-01-17 18:46:41 -08:00
Krrish Dholakia | 08b409bae8 | fix(utils.py): fix if check | 2024-01-17 17:17:58 -08:00
Krrish Dholakia | 7ed4d9b4d1 | fix(utils.py): allow dynamically setting boto3 init and switching between bedrock and openai | 2024-01-17 15:56:30 -08:00
Krrish Dholakia | 8e9dc09955 | fix(bedrock.py): add support for sts based boto3 initialization (https://github.com/BerriAI/litellm/issues/1476) | 2024-01-17 12:08:59 -08:00
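STS-based boto3 initialization (8e9dc09955) means assuming an IAM role to obtain temporary credentials, then building the Bedrock client from those instead of long-lived keys. A sketch of the pattern, assuming boto3 is available — the role ARN, session name, and region are placeholders, and `bedrock_client_from_sts` is an illustrative helper, not LiteLLM's API:

```python
try:
    import boto3  # optional dependency; the sketch only runs if it is installed
except ImportError:
    boto3 = None

def bedrock_client_from_sts(role_arn: str, region: str = "us-west-2"):
    """Assume an IAM role via STS and build a Bedrock runtime client
    from the temporary credentials it returns."""
    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn=role_arn,
        RoleSessionName="litellm-bedrock",  # placeholder session name
    )["Credentials"]
    return boto3.client(
        "bedrock-runtime",
        region_name=region,
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
```

The temporary credentials expire (one hour by default), so long-running processes need to re-assume the role before expiry.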
Krrish Dholakia | 7b39aacadf | fix(utils.py): mistral optional param mapping | 2024-01-17 09:44:21 -08:00
ishaan-jaff | 00ac18e8b7 | (feat) improve bedrock, sagemaker exception mapping | 2024-01-15 21:22:22 -08:00
ishaan-jaff | fcc1e23a05 | (fix) post_call rules | 2024-01-15 20:56:25 -08:00
ishaan-jaff | e864c78d15 | (feat) post call rules - fail with error message | 2024-01-15 17:13:13 -08:00
ishaan-jaff | 79ad63009e | (feat) support extra body for Azure, OpenAI | 2024-01-13 14:32:11 -08:00
ishaan-jaff | 6bae534968 | (fix) check if custom_llm_provider is not None | 2024-01-13 12:54:03 -08:00
ishaan-jaff | 53fd62b0cd | (feat) use custom_llm_provider in completion_cost | 2024-01-13 12:29:51 -08:00
ishaan-jaff | 6b2a4714a6 | (feat) return custom_llm_provider in streaming response | 2024-01-12 17:14:43 -08:00
David Leen | a674de8f36 | improve bedrock exception granularity | 2024-01-12 16:38:55 +01:00
Ishaan Jaff | d181bd22a7 | Merge pull request #1422 from dleen/httpx: (fix) create httpx.Request instead of httpx.request | 2024-01-11 22:31:55 +05:30
David Leen | 6b87c13b9d | (fix) create httpx.Request instead of httpx.request (fixes #1420) | 2024-01-11 16:22:26 +01:00
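The distinction behind PR #1422: `httpx.request(...)` (lowercase) immediately opens a connection and sends an HTTP request, while `httpx.Request(...)` (capitalized) only constructs a request object to attach to a response or send later — exception-mapping code wants the object, not a live network call. A small sketch of the correct side of that fix (`build_request` is an illustrative wrapper; httpx import is guarded since it is a third-party package):

```python
try:
    import httpx
except ImportError:
    httpx = None

def build_request(method: str, url: str):
    """Construct an httpx.Request without performing any network I/O."""
    return httpx.Request(method, url)
```

This matters when synthesizing error responses: `httpx.Response` expects a `request=` object, so calling the lowercase function there would fire a real (and failing) HTTP request as a side effect.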
ishaan-jaff | 1fb3547e48 | (feat) improve litellm verbose logs | 2024-01-11 18:13:08 +05:30
ishaan-jaff | f297a4d174 | (feat) show args passed to litellm.completion, acompletion on call | 2024-01-11 17:56:27 +05:30
Ishaan Jaff | 2433d6c613 | Merge pull request #1200 from MateoCamara/explicit-args-acomplete: feat: added explicit args to acomplete | 2024-01-11 10:39:05 +05:30
ishaan-jaff | f61d8596e1 | (fix) working s3 logging | 2024-01-11 08:57:32 +05:30
Krrish Dholakia | 3080f27b54 | fix(utils.py): raise correct error for azure content blocked error | 2024-01-10 23:31:51 +05:30
Mateo Cámara | 203089e6c7 | Merge branch 'main' into explicit-args-acomplete | 2024-01-09 13:07:37 +01:00
Ishaan Jaff | 4cfa010dbd | Merge pull request #1381 from BerriAI/litellm_content_policy_violation_exception: [Feat] Add litellm.ContentPolicyViolationError | 2024-01-09 17:18:29 +05:30
ishaan-jaff | 248e5f3d92 | (chore) remove deprecated completion_with_config() tests | 2024-01-09 17:13:06 +05:30
ishaan-jaff | 186fc4614d | (feat) add ContentPolicyViolationError for azure | 2024-01-09 16:58:09 +05:30
ishaan-jaff | 9da61bdf31 | (fix) ContentPolicyViolationError | 2024-01-09 16:53:15 +05:30
Mateo Cámara | bb06c51ede | Added a test checking that acompletion accepts the same parameters as CompletionRequest attributes; used functools in the client decorator to expose acompletion's parameters externally | 2024-01-09 12:06:49 +01:00
ishaan-jaff | 09874cc83f | (v0) add ContentPolicyViolationError | 2024-01-09 16:33:03 +05:30
ishaan-jaff | 5f2cbfc711 | (feat) litellm.completion - support ollama timeout | 2024-01-09 10:34:41 +05:30
Krrish Dholakia | dd78782133 | fix(utils.py): error handling for litellm --model mistral edge case | 2024-01-08 15:09:01 +05:30
Krrish Dholakia | 6333fbfe56 | fix(main.py): support cost calculation for text completion streaming object | 2024-01-08 12:41:43 +05:30
Krrish Dholakia | 9b46412279 | fix(utils.py): fix logging for text completion streaming | 2024-01-08 12:05:28 +05:30