Commit graph

1046 commits

Krrish Dholakia
e04a4a7439 fix(utils.py): fix content policy violation check for streaming 2024-01-23 06:55:04 -08:00
Ishaan Jaff
97dd61a6cb Merge pull request #1561 from BerriAI/litellm_sagemaker_streaming: [Feat] Add REAL Sagemaker streaming 2024-01-22 22:10:20 -08:00
ishaan-jaff
09dd1ed68b v0 sagemaker_stream 2024-01-22 21:53:16 -08:00
Krrish Dholakia
29fe97b6a9 fix(router.py): fix order of dereferenced dictionaries 2024-01-22 21:42:25 -08:00
Krish Dholakia
3eaae0e73c Merge pull request #1557 from BerriAI/litellm_emit_spend_logs: feat(utils.py): emit response cost as part of logs 2024-01-22 21:02:40 -08:00
Krrish Dholakia
579dfc3013 test: fix tests 2024-01-22 20:20:17 -08:00
Krrish Dholakia
db2b7bfd4e fix(openai.py): fix linting issue 2024-01-22 18:20:15 -08:00
Krish Dholakia
8647f2a665 Merge pull request #1556 from BerriAI/litellm_importlib_issue: fix(utils.py): move from pkg_resources to importlib 2024-01-22 15:56:07 -08:00
Krrish Dholakia
2ea18785ca feat(utils.py): emit response cost as part of logs 2024-01-22 15:53:04 -08:00
Krrish Dholakia
737a5a7b38 fix(utils.py): fix debug log 2024-01-22 15:15:34 -08:00
Krrish Dholakia
70b0d0307c refactor(utils.py): fix linting errors 2024-01-22 15:15:34 -08:00
Krrish Dholakia
e423aeff85 fix: support streaming custom cost completion tracking 2024-01-22 15:15:34 -08:00
Krrish Dholakia
85b9ad7def fix(main.py): support custom pricing for embedding calls 2024-01-22 15:15:34 -08:00
Krrish Dholakia
480c3d3991 feat(utils.py): support custom cost tracking per second (https://github.com/BerriAI/litellm/issues/1374) 2024-01-22 15:15:34 -08:00
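The per-second cost tracking named above amounts to billing on elapsed call time rather than token counts. A purely illustrative sketch; the function and parameter names are hypothetical, not litellm's actual API:

```python
# Illustrative arithmetic only; names are hypothetical, not litellm's API.
# Per-second pricing bills on elapsed call duration instead of token counts.
def per_second_cost(elapsed_seconds: float, cost_per_second: float) -> float:
    return elapsed_seconds * cost_per_second

print(per_second_cost(elapsed_seconds=2.5, cost_per_second=0.0004))  # 0.001
```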
Krrish Dholakia
78308ddf91 fix(utils.py): move from pkg_resources to importlib 2024-01-22 15:05:09 -08:00
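The pkg_resources to importlib move above follows the common migration pattern: importlib.metadata ships with the standard library (Python 3.8+) and avoids importing the slow, deprecated pkg_resources module from setuptools. A minimal sketch of the idea, using litellm's own package name as the example:

```python
# Standard-library replacement for pkg_resources.get_distribution("litellm").version
from importlib.metadata import PackageNotFoundError, version

try:
    print(version("litellm"))
except PackageNotFoundError:
    print("litellm is not installed")
```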
Krrish Dholakia
b55dd5aa57 fix(utils.py): fix async/sync streaming logging 2024-01-22 13:54:51 -08:00
Krrish Dholakia
f7b7dd0b6f fix(utils.py): fix callback logging 2024-01-21 00:56:30 -08:00
Krrish Dholakia
59483da18b fix: fix proxy logging 2024-01-20 18:22:45 -08:00
Krrish Dholakia
2acdcc6671 fix: support info level logging on pkg + proxy 2024-01-20 17:45:47 -08:00
Krrish Dholakia
2ecc2f12cd fix(gemini.py): support streaming 2024-01-19 20:21:34 -08:00
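For context on the Gemini streaming fix above: the underlying google-generativeai SDK streams by passing stream=True and iterating chunks. A hedged sketch of that SDK usage; the API key and model name are placeholders, and this is not necessarily how gemini.py wires it internally:

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")            # placeholder
model = genai.GenerativeModel("gemini-pro")
response = model.generate_content("Write a haiku about caching", stream=True)
for chunk in response:                              # chunks arrive incrementally
    print(chunk.text, end="", flush=True)
```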
Krrish Dholakia
b726b04309 fix(utils.py): revert exception mapping change 2024-01-19 17:39:35 -08:00
Krrish Dholakia
e957f41ab7 fix(utils.py): add metadata to logging obj on setup, if exists 2024-01-19 17:29:47 -08:00
ishaan-jaff
1d60edad57 (fix) async langfuse logger 2024-01-19 10:44:51 -08:00
ishaan-jaff
af808d4927 (fix) use asyncio run_in_executor 2024-01-19 09:52:51 -08:00
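The run_in_executor fix above reflects a standard asyncio pattern: hand a blocking call (here, a stand-in for a synchronous logger) to the default thread-pool executor so it does not block the event loop. A minimal sketch with illustrative names:

```python
import asyncio

def blocking_log(payload: dict) -> None:
    # Stand-in for a synchronous logging/HTTP call that would block the loop.
    print("logged:", payload)

async def log_event(payload: dict) -> None:
    loop = asyncio.get_running_loop()
    # None -> run on the default ThreadPoolExecutor.
    await loop.run_in_executor(None, blocking_log, payload)

asyncio.run(log_event({"model": "gpt-3.5-turbo", "response_cost": 0.0021}))
```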
ishaan-jaff
8a7b01bfa2 (v0) fix 2024-01-19 08:51:14 -08:00
ishaan-jaff
d8d1cea69f (feat) support user param for all providers 2024-01-18 17:45:59 -08:00
ishaan-jaff
054d3a549e (feat) return Azure enhancements used 2024-01-17 18:46:41 -08:00
Krrish Dholakia
d4404fb61e fix(utils.py): fix if check 2024-01-17 17:17:58 -08:00
Krrish Dholakia
01a5e80df5 fix(utils.py): allow dynamically setting boto3 init and switching between bedrock and openai 2024-01-17 15:56:30 -08:00
Krrish Dholakia
cc89aa7456 fix(bedrock.py): add support for sts based boto3 initialization (https://github.com/BerriAI/litellm/issues/1476) 2024-01-17 12:08:59 -08:00
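STS-based initialization, as referenced above, generally means assuming an IAM role via STS and building the Bedrock client from the returned temporary credentials. A minimal sketch of that pattern; the role ARN, session name, and region are placeholders, and this is not litellm's exact code:

```python
import boto3

def bedrock_client_via_sts(role_arn: str, region: str = "us-west-2"):
    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn=role_arn,
        RoleSessionName="litellm-session",   # placeholder session name
    )["Credentials"]
    # Build the Bedrock runtime client from the temporary credentials.
    return boto3.client(
        "bedrock-runtime",
        region_name=region,
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
```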
Krrish Dholakia
2180977acf fix(utils.py): mistral optional param mapping 2024-01-17 09:44:21 -08:00
ishaan-jaff
83861730b5 (feat) improve bedrock, sagemaker exception mapping 2024-01-15 21:22:22 -08:00
ishaan-jaff
874358c398 (fix) post_call rules 2024-01-15 20:56:25 -08:00
ishaan-jaff
b1c93fdc52 (feat) post call rules - fail with error message 2024-01-15 17:13:13 -08:00
ishaan-jaff
823a932377 (feat) support extra body for Azure, OpenAI 2024-01-13 14:32:11 -08:00
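The extra-body support above presumably rides on the OpenAI v1 SDK's extra_body parameter, which merges arbitrary fields into the request JSON. A hedged sketch of that mechanism; the endpoint, key, deployment name, and extra fields shown are placeholders:

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="...",                                           # placeholder
    api_version="2023-12-01-preview",
    azure_endpoint="https://my-resource.openai.azure.com",   # placeholder
)
response = client.chat.completions.create(
    model="my-gpt-4-deployment",                             # placeholder deployment
    messages=[{"role": "user", "content": "hello"}],
    extra_body={"enhancements": {"ocr": {"enabled": True}}}, # example provider-specific field
)
```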
ishaan-jaff
e67ac90796 (fix) check if custom_llm_provider is not None 2024-01-13 12:54:03 -08:00
ishaan-jaff
8e6d170da9 (feat) use custom_llm_provider in completion_cost 2024-01-13 12:29:51 -08:00
Krrish Dholakia
f08bb7e41f fix(utils.py): exclude s3 caching from individual item caching for embedding list (can't bulk upload to s3, so this will slow down calls; https://github.com/BerriAI/litellm/pull/1417) 2024-01-13 16:19:30 +05:30
Krrish Dholakia
79cc739b53 fix(caching.py): fix async in-memory caching 2024-01-13 15:33:57 +05:30
Krrish Dholakia
cdadac1649 fix(caching.py): return updated kwargs from get_cache helper function 2024-01-13 15:04:34 +05:30
Krrish Dholakia
880f829013 fix(caching.py): use bulk writes and blocking connection pooling for reads from Redis 2024-01-13 11:50:50 +05:30
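A minimal sketch of the bulk-write plus blocking-connection-pool pattern named above, using redis-py's asyncio client; key names and pool size are illustrative, and a local Redis server is assumed:

```python
import asyncio
import redis.asyncio as redis

async def main():
    # Bounded pool: callers wait for a free connection instead of opening new ones.
    pool = redis.BlockingConnectionPool(max_connections=10)
    client = redis.Redis(connection_pool=pool)

    # Bulk write: queue several SETs and flush them in one round trip.
    async with client.pipeline(transaction=False) as pipe:
        for i in range(3):
            pipe.set(f"cache:key:{i}", f"value-{i}", ex=60)
        await pipe.execute()

    # Bulk read.
    print(await client.mget(["cache:key:0", "cache:key:1", "cache:key:2"]))
    await client.aclose()  # redis-py >= 5; use close() on older versions

asyncio.run(main())
```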
ishaan-jaff
69552146f9 (feat) return custom_llm_provider in streaming response 2024-01-12 17:14:43 -08:00
Krrish Dholakia
813fb19620 fix: support async redis caching 2024-01-12 21:46:41 +05:30
David Leen
1660e4ab72 improve bedrock exception granularity 2024-01-12 16:38:55 +01:00
Krish Dholakia
7ecfc09221 Merge branch 'main' into litellm_embedding_caching_updates 2024-01-11 23:58:51 +05:30
Ishaan Jaff
8b0f93dd2d Merge pull request #1422 from dleen/httpx: (fix) create httpx.Request instead of httpx.request 2024-01-11 22:31:55 +05:30
David Leen
3cd5ef9a02 (fix) create httpx.Request instead of httpx.request (fixes #1420) 2024-01-11 16:22:26 +01:00
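The httpx fix above hinges on a real distinction in the library: httpx.request(...) performs an HTTP call immediately, while httpx.Request(...) only constructs a request object, which is what is needed when building a Response by hand (for example, for exception mapping). The URL and payload below are placeholders:

```python
import httpx

# httpx.Request(...) builds the object without touching the network;
# httpx.request(...) would actually send an HTTP request.
req = httpx.Request("POST", "https://api.example.com/v1/chat/completions")
resp = httpx.Response(status_code=429, request=req, content=b'{"error": "rate limited"}')
print(resp.status_code, resp.request.method)   # 429 POST
```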
Krrish Dholakia
107ba5aece fix(utils.py): use preset cache key for async calls as well 2024-01-11 19:30:28 +05:30
Krrish Dholakia
4ed9f2b712 fix(utils.py): bug fixes 2024-01-11 19:02:17 +05:30
Krrish Dholakia
63298d6c75 fix(utils.py): bug fixes 2024-01-11 18:14:22 +05:30