Commit graph

1001 commits

Author  SHA1  Message  Date
Vince Loewe  91f77fa7a6  fix timestamps and user  2024-02-27 22:30:32 -08:00
Vince Loewe  c8a0a38c2a  feat: LLMonitor is now Lunary  2024-02-27 22:07:13 -08:00
Krrish Dholakia  6a6a334788  fix(utils.py): map optional params for gemini pro vision  2024-02-27 14:45:53 -08:00
Ishaan Jaff  66aead6073  Merge pull request #2216 from BerriAI/litellm_fix_using_mistral_azure_ai ([FIX] using mistral on azure ai studio)  2024-02-27 08:37:20 -08:00
ishaan-jaff  7eb9799bba  (fix) support mistral on azure ai studio  2024-02-27 06:48:09 -08:00
zu1k  21e4a8e94d  fix(utils.py): fix compatibility between together_ai and openai-python  2024-02-27 16:38:45 +08:00
Krish Dholakia  f76c98c267  Merge pull request #2208 from BerriAI/litellm_enforce_team_limits (Litellm enforce team limits)  2024-02-26 23:10:01 -08:00
Krrish Dholakia  5c28e2091f  fix(utils.py): fix redis cache test  2024-02-26 22:04:24 -08:00
Krish Dholakia  b0f96411f5  Merge pull request #2203 from BerriAI/litellm_streaming_caching_fix (fix(utils.py): support returning caching streaming response for function calling streaming calls)  2024-02-26 19:58:00 -08:00
Krrish Dholakia  de80b3fec7  fix(utils.py): fixing sync streaming for caching  2024-02-26 19:32:30 -08:00
Krrish Dholakia  4ba18f9932  fix(utils.py): fix streaming logic  2024-02-26 14:26:58 -08:00
Krrish Dholakia  4e608c86c1  fix(utils.py): fix streaming  2024-02-26 12:52:53 -08:00
Krrish Dholakia  dcca55159b  fix(utils.py): support returning caching streaming response for function calling streaming calls  2024-02-26 12:31:00 -08:00
Krrish Dholakia  a78b01158e  fix(utils.py): fix vertex ai finish reason handling  2024-02-26 08:59:55 -08:00
Ishaan Jaff  859f8fe32c  Merge pull request #2187 from BerriAI/litellm_clickhouse_logs ([FEAT] Use Logging on clickhouse)  2024-02-26 08:26:02 -08:00
ishaan-jaff  4e8b62ccea  (feat) use clickhouse as a logging provider  2024-02-26 07:35:16 -08:00
Krish Dholakia  686770376e  Merge branch 'main' into litellm_block_unblock_user_api  2024-02-24 11:43:16 -08:00
Krish Dholakia  5461e2fee8  Merge pull request #2175 from BerriAI/litellm_stricter_function_calling_streaming_tests (fix(utils.py): stricter azure function calling tests)  2024-02-23 22:58:07 -08:00
Krish Dholakia  7657a5bd1b  Merge pull request #2165 from BerriAI/litellm_langfuse_ttfs_logging (fix(langfuse.py): support time to first token logging on langfuse)  2024-02-23 21:50:03 -08:00
Krrish Dholakia  74b97b23e9  fix(utils.py): stricter azure function calling tests  2024-02-23 20:55:32 -08:00
Krrish Dholakia  483099fb5c  fix(utils.py): add more logging statements  2024-02-23 15:17:22 -08:00
ishaan-jaff  24fb50ff29  (feat) add groq ai  2024-02-23 10:42:51 -08:00
Krrish Dholakia  78f8b4836b  fix(langfuse.py): support time to first token logging on langfuse  2024-02-23 10:04:12 -08:00
Ishaan Jaff  6155a5b5e8  Merge pull request #2164 from BerriAI/litellm_support_extra_headers ([FEAT] Support extra headers - OpenAI / Azure)  2024-02-23 09:42:09 -08:00
Ishaan Jaff  71820378aa  Merge pull request #2163 from vivek-athina/feature/athina (Support for Athina logging)  2024-02-23 09:39:06 -08:00
ishaan-jaff  de8283dac4  (feat) support extra_headers  2024-02-23 08:48:21 -08:00
Vivek Aditya  0315be6db6  Support for Athina logging  2024-02-23 21:49:40 +05:30
Krish Dholakia  858c21c96d  Merge branch 'main' into litellm_fix_azure_function_calling_streaming  2024-02-22 22:36:38 -08:00
Krish Dholakia  14c05bdea6  Merge branch 'main' into litellm_banned_keywords_list  2024-02-22 22:20:59 -08:00
Krrish Dholakia  5775526453  test(test_streaming.py): add exception mapping for palm timeout error  2024-02-22 21:36:57 -08:00
Krrish Dholakia  2e5a5f82a3  fix(vertex_ai.py): fix vertex ai function calling  2024-02-22 21:28:24 -08:00
Krrish Dholakia  3b156888a6  fix(utils.py): ensure argument is always a string  2024-02-22 15:15:56 -08:00
Krrish Dholakia  a378edf0c9  fix(utils.py): add exception mapping for gemini  2024-02-21 21:31:26 -08:00
Krrish Dholakia  8f8ac9d94e  fix(utils.py): only return cached streaming object for streaming calls  2024-02-21 21:27:40 -08:00
Krrish Dholakia  654f4055e0  fix(utils.py): add palm exception mapping for 500 internal server error  2024-02-21 21:18:03 -08:00
Krrish Dholakia  ae5742ba11  fix(utils.py): enable streaming cache logging  2024-02-21 21:10:58 -08:00
Krrish Dholakia  2abefd03c8  fix(utils.py): support streaming cached response logging  2024-02-21 17:53:14 -08:00
Krish Dholakia  d1cd3073c3  Merge pull request #1969 from kan-bayashi/fix/support-multiple-tools-in-gemini (fix: fix the issues when using tools in gemini)  2024-02-21 11:46:26 -08:00
Krrish Dholakia  7089b13632  fix(llm_guard.py): add streaming hook for moderation calls  2024-02-20 20:31:32 -08:00
Krrish Dholakia  5b6b1f120f  fix(utils.py): fix print statement  2024-02-19 23:00:41 -08:00
Krrish Dholakia  11c12e7381  fix(gemini.py): fix async streaming + add native async completions  2024-02-19 22:41:36 -08:00
Krrish Dholakia  423bd4012d  refactor: rename enterprise specific hooks and callbacks to be more precise  2024-02-17 17:52:45 -08:00
Krish Dholakia  8db15adfa6  Merge pull request #2017 from sorokine/main  2024-02-17 07:07:56 -08:00
Krrish Dholakia  67cd9b1c63  feat(llama_guard.py): add llama guard support for content moderation + new async_moderation_hook endpoint  2024-02-16 18:45:25 -08:00
Krrish Dholakia  2a5a14d612  fix(utils.py): support image gen logging to langfuse  2024-02-16 16:12:52 -08:00
Alexandre Sorokine  5684b232ec  fix for importlib compatibility issue for python 3.8 (was tested for python versions 3.8, 3.9, 3.10, 3.11, 3.12)  2024-02-16 15:52:39 -05:00
Krish Dholakia  3c0fc3ea3c  Merge branch 'main' into litellm_moderations_improvements  2024-02-15 23:08:25 -08:00
Krish Dholakia  233aff5ba9  Merge pull request #1916 from RenaLu/main (Add support for Vertex AI custom models deployed on private endpoint)  2024-02-15 22:47:36 -08:00
Krrish Dholakia  6943358cf7  fix(utils.py): add more exception mapping for huggingface  2024-02-15 21:26:22 -08:00
Krish Dholakia  1d04fb3cd6  Merge pull request #1989 from BerriAI/litellm_redis_url_fix (fix(redis.py): fix instantiating redis client from url)  2024-02-15 21:23:17 -08:00