Commit graph

971 commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Vivek Aditya | 244fa1c4ab | Support for Athina logging | 2024-02-23 21:49:40 +05:30 |
| Krrish Dholakia | 9d17a0789f | fix(vertex_ai.py): fix vertex ai function calling | 2024-02-22 21:28:24 -08:00 |
| Krrish Dholakia | 6ba1a5f6b2 | fix(utils.py): add exception mapping for gemini | 2024-02-21 21:31:26 -08:00 |
| Krrish Dholakia | fb2ae3a032 | fix(utils.py): only return cached streaming object for streaming calls | 2024-02-21 21:27:40 -08:00 |
| Krrish Dholakia | f1742769a2 | fix(utils.py): add palm exception mapping for 500 internal server error | 2024-02-21 21:18:03 -08:00 |
| Krrish Dholakia | 2d62dee712 | fix(utils.py): enable streaming cache logging | 2024-02-21 21:10:58 -08:00 |
| Krrish Dholakia | f8b233b653 | fix(utils.py): support streaming cached response logging | 2024-02-21 17:53:14 -08:00 |
| Krish Dholakia | 851473b71a | Merge pull request #1969 from kan-bayashi/fix/support-multiple-tools-in-gemini (fix: fix the issues when using tools in gemini) | 2024-02-21 11:46:26 -08:00 |
| Krrish Dholakia | 49847347d0 | fix(llm_guard.py): add streaming hook for moderation calls | 2024-02-20 20:31:32 -08:00 |
| Krrish Dholakia | 7b641491a2 | fix(utils.py): fix print statement | 2024-02-19 23:00:41 -08:00 |
| Krrish Dholakia | 45eb4a5fcc | fix(gemini.py): fix async streaming + add native async completions | 2024-02-19 22:41:36 -08:00 |
| Krrish Dholakia | 73acdf3736 | refactor: rename enterprise specific hooks and callbacks to be more precise | 2024-02-17 17:52:45 -08:00 |
| Krish Dholakia | 9ce29295af | Merge pull request #2017 from sorokine/main | 2024-02-17 07:07:56 -08:00 |
| Krrish Dholakia | 2a4a6995ac | feat(llama_guard.py): add llama guard support for content moderation + new async_moderation_hook endpoint | 2024-02-16 18:45:25 -08:00 |
| Krrish Dholakia | f57483ea70 | fix(utils.py): support image gen logging to langfuse | 2024-02-16 16:12:52 -08:00 |
| Alexandre Sorokine | 3c919cbbe7 | fix for importllib compatibility issue for python 3.8 (was tested for python versions 3.8, 3.9, 3.10, 3.11, 3.12) | 2024-02-16 15:52:39 -05:00 |
| Krish Dholakia | 999fab82f7 | Merge branch 'main' into litellm_moderations_improvements | 2024-02-15 23:08:25 -08:00 |
| Krish Dholakia | 9b60ef9a3c | Merge pull request #1916 from RenaLu/main (Add support for Vertex AI custom models deployed on private endpoint) | 2024-02-15 22:47:36 -08:00 |
| Krrish Dholakia | c37aad50ea | fix(utils.py): add more exception mapping for huggingface | 2024-02-15 21:26:22 -08:00 |
| Krish Dholakia | 1e238614c8 | Merge pull request #1989 from BerriAI/litellm_redis_url_fix (fix(redis.py): fix instantiating redis client from url) | 2024-02-15 21:23:17 -08:00 |
| Krrish Dholakia | eb45df16f1 | fix(test_streaming.py): handle hf tgi zephyr not loading for streaming issue | 2024-02-15 19:24:02 -08:00 |
| ishaan-jaff | daa61cfdb6 | (fix) merge conflicts | 2024-02-15 18:34:53 -08:00 |
| Ishaan Jaff | 41302ed903 | Merge branch 'main' into litellm_allow_users_use_generic_api_logger | 2024-02-15 18:27:25 -08:00 |
| ishaan-jaff | ec16b536a1 | (fix) merge conflict | 2024-02-15 18:25:19 -08:00 |
| Krrish Dholakia | aa78f55dff | fix(utils.py): add try-except around the tiktoken local import | 2024-02-15 17:57:31 -08:00 |
| ishaan-jaff | 56fba95b4a | (fix) importing enterprise features | 2024-02-15 17:24:27 -08:00 |
| ishaan-jaff | 07afefea34 | (chore) debug sys path docker error | 2024-02-15 17:23:07 -08:00 |
| Krrish Dholakia | 3a5d40f4b9 | fix(utils.py): fix local tiktoken package import | 2024-02-15 16:18:54 -08:00 |
| ishaan-jaff | 47b8715d25 | (feat) fix api callback imports | 2024-02-15 16:15:36 -08:00 |
| ishaan-jaff | 4e8a94b916 | (feat) log with generic logger | 2024-02-15 13:43:16 -08:00 |
| Krish Dholakia | 78bff46d4a | Merge pull request #1947 from sorokine/main (Update utils.py: switched from pkg_resources to importlib.resources for `filename` stored in the `os.environ(TIKTOKEN_CACHE_DIR)`) | 2024-02-14 22:02:35 -08:00 |
| Krish Dholakia | 57654f4533 | Merge branch 'main' into litellm_aioboto3_sagemaker | 2024-02-14 21:46:58 -08:00 |
| Rena Lu | 9c45bfdb70 | handle optional params | 2024-02-14 23:27:16 +00:00 |
| ishaan-jaff | 693efc8e84 | (feat) add moderation on router | 2024-02-14 11:00:09 -08:00 |
| kan-bayashi | a4e33c8c67 | fix the issue when using multiple tools in gemini | 2024-02-14 13:09:35 +09:00 |
| Krrish Dholakia | afad40b4f8 | fix(utils.py): fix streaming rule calling | 2024-02-12 22:36:32 -08:00 |
| Krrish Dholakia | 7600c8f41d | feat(utils.py): enable post call rules for streaming | 2024-02-12 22:08:04 -08:00 |
| Krrish Dholakia | b1bc30ee16 | feat(sagemaker.py): aioboto3 streaming support | 2024-02-12 21:18:34 -08:00 |
| Alexandre Sorokine | 36360c62ac | Update utils.py: switched from pkg_resources to importlib.resources for filename stored in the os.environ(TIKTOKEN_CACHE_DIR) | 2024-02-12 16:28:38 -05:00 |
| ishaan-jaff | e36f1af389 | (feat) fix litellm verbose logger | 2024-02-09 19:01:21 -08:00 |
| ishaan-jaff | 6dc7ded1a6 | (bedrock) raise timeout error | 2024-02-09 14:37:34 -08:00 |
| Krrish Dholakia | b9393fb769 | fix(test_parallel_request_limiter.py): use mock responses for streaming | 2024-02-08 21:45:38 -08:00 |
| Ishaan Jaff | 59465bd612 | Merge pull request #1892 from BerriAI/litellm_speed_up_s3_logging ([FEAT] 76 % Faster s3 logging Proxy / litellm.acompletion / router.acompletion 🚀) | 2024-02-08 11:48:36 -08:00 |
| ishaan-jaff | ac4d9a7542 | (feat) speed up s3 logging | 2024-02-08 10:59:54 -08:00 |
| ishaan-jaff | 9a23317461 | (feat) fix s3 cache | 2024-02-08 10:01:56 -08:00 |
| ishaan-jaff | 6969b25946 | (fix) azure cost calc | 2024-02-07 17:33:10 -08:00 |
| ishaan-jaff | bb469278c6 | (fix) cost tracking | 2024-02-07 17:06:05 -08:00 |
| ishaan-jaff | 0764af4392 | (feat) use base_model for azure response_cost | 2024-02-07 16:33:07 -08:00 |
| Krrish Dholakia | 655fcd4d79 | fix(utils.py): fix ollama stop sequence mapping | 2024-02-07 13:14:03 -08:00 |
| ishaan-jaff | ef32a5da1b | (feat) redis-semantic cache | 2024-02-06 13:30:55 -08:00 |