Author | Commit | Message | Date
Krrish Dholakia | f0a5e0ffe9 | fix(bedrock.py): working image calls to claude 3 | 2024-03-04 18:12:47 -08:00
Krrish Dholakia | dad65ca602 | fix(bedrock.py): support anthropic messages api on bedrock (claude-3) | 2024-03-04 17:15:47 -08:00
Krrish Dholakia | 4f5f6ec812 | test(test_completion.py): add testing for anthropic vision calling | 2024-03-04 13:34:49 -08:00
Krrish Dholakia | 1e2154317c | feat(anthropic.py): adds tool calling support | 2024-03-04 10:42:28 -08:00
Krrish Dholakia | a1ce24c5f0 | fix(huggingface_restapi.py): fix huggingface streaming error raising | 2024-03-04 09:32:41 -08:00
Ishaan Jaff | 561e7ff453 | Merge pull request #2315 from BerriAI/litellm_add_claude_3 ([FEAT]- add claude 3) | 2024-03-04 09:23:13 -08:00
Ishaan Jaff | fc34999168 | Merge pull request #2290 from ti3x/bedrock_mistral (Add support for Bedrock Mistral models) | 2024-03-04 08:42:47 -08:00
Krrish Dholakia | 8c2ac9101e | fix(utils.py): fix num retries logic | 2024-03-04 08:01:02 -08:00
ishaan-jaff | b3f738832c | (feat) streaming claude-3 | 2024-03-04 07:29:23 -08:00
ishaan-jaff | 26eea94404 | (feat) - add claude 3 | 2024-03-04 07:13:08 -08:00
Tim Xia | 7a5602e8f4 | update comments | 2024-03-02 13:34:39 -05:00
Tim Xia | 79a62564f3 | map optional params | 2024-03-02 13:25:04 -05:00
Krish Dholakia | da0ab536d3 | Merge pull request #2292 from BerriAI/litellm_mistral_streaming_error (fix(utils.py): handle mistral streaming error) | 2024-03-02 07:48:14 -08:00
Mikhail Khludnev | 2686d0368e | utils.validate_environment to handle OLLAMA_API_BASE env | 2024-03-02 13:41:59 +03:00
Krrish Dholakia | f27fc59e77 | fix(utils.py): handle mistral streaming error | 2024-03-01 21:23:10 -08:00
Krish Dholakia | 7299895236 | Merge pull request #2281 from mkhludnev/fix#2260-2261 (fix #2260 #2261) | 2024-03-01 13:23:01 -08:00
Mikhail Khludnev | 06d9a4b831 | fix #2260 #2261 | 2024-03-01 21:46:57 +03:00
Krrish Dholakia | a60ff95011 | feat(proxy_server.py): add new team_member delete endpoint | 2024-03-01 09:14:08 -08:00
ishaan-jaff | 4d84063712 | (chore) add mistral azure ai comments | 2024-02-29 12:04:16 -08:00
ishaan-jaff | a43678e4e2 | (test) fix mistral tests | 2024-02-29 12:01:01 -08:00
ishaan-jaff | 934c035be3 | (feat) use mistral azure with env vars | 2024-02-29 08:28:46 -08:00
ishaan-jaff | 5d087116fe | (feat) mistral allow setting API base in env | 2024-02-29 08:15:47 -08:00
ishaan-jaff | ec21b39548 | (feat) helpers for supports_function_calling | 2024-02-28 18:15:05 -08:00
ishaan-jaff | ea4e7f79cb | (feat) support mistral function calling | 2024-02-28 18:15:05 -08:00
Krrish Dholakia | d3fd1822b9 | fix(utils.py): fix palm exception mapping | 2024-02-28 18:15:05 -08:00
ishaan-jaff | bb33c9230d | (fix) async logging race condition | 2024-02-28 14:44:02 -08:00
ishaan-jaff | 18343ba679 | (feat) add mistral tool calling support | 2024-02-28 11:48:20 -08:00
Krish Dholakia | 232efb5a80 | Merge branch 'main' into litellm_streaming_format_fix | 2024-02-27 20:16:09 -08:00
Krrish Dholakia | c3b6138d92 | fix(utils.py): fix streaming issue | 2024-02-27 14:57:50 -08:00
Krrish Dholakia | 6a6a334788 | fix(utils.py): map optional params for gemini pro vision | 2024-02-27 14:45:53 -08:00
Krrish Dholakia | 00962c5381 | fix(proxy_server.py): drop none values in streaming response | 2024-02-27 14:37:29 -08:00
Ishaan Jaff | 66aead6073 | Merge pull request #2216 from BerriAI/litellm_fix_using_mistral_azure_ai ([FIX] using mistral on azure ai studio) | 2024-02-27 08:37:20 -08:00
ishaan-jaff | 7eb9799bba | (fix) support mistral on azure ai studio | 2024-02-27 06:48:09 -08:00
zu1k | 21e4a8e94d | fix(utils.py): fix compatibility between together_ai and openai-python | 2024-02-27 16:38:45 +08:00
Krish Dholakia | f76c98c267 | Merge pull request #2208 from BerriAI/litellm_enforce_team_limits (Litellm enforce team limits) | 2024-02-26 23:10:01 -08:00
Krrish Dholakia | 5c28e2091f | fix(utils.py): fix redis cache test | 2024-02-26 22:04:24 -08:00
Krish Dholakia | b0f96411f5 | Merge pull request #2203 from BerriAI/litellm_streaming_caching_fix (fix(utils.py): support returning caching streaming response for function calling streaming calls) | 2024-02-26 19:58:00 -08:00
Krrish Dholakia | de80b3fec7 | fix(utils.py): fixing sync streaming for caching | 2024-02-26 19:32:30 -08:00
Krrish Dholakia | 4ba18f9932 | fix(utils.py): fix streaming logic | 2024-02-26 14:26:58 -08:00
Krrish Dholakia | 4e608c86c1 | fix(utils.py): fix streaming | 2024-02-26 12:52:53 -08:00
Krrish Dholakia | dcca55159b | fix(utils.py): support returning caching streaming response for function calling streaming calls | 2024-02-26 12:31:00 -08:00
Krrish Dholakia | a78b01158e | fix(utils.py): fix vertex ai finish reason handling | 2024-02-26 08:59:55 -08:00
Ishaan Jaff | 859f8fe32c | Merge pull request #2187 from BerriAI/litellm_clickhouse_logs ([FEAT] Use Logging on clickhouse) | 2024-02-26 08:26:02 -08:00
ishaan-jaff | 4e8b62ccea | (feat) use clickhouse as a logging provider | 2024-02-26 07:35:16 -08:00
Krish Dholakia | 686770376e | Merge branch 'main' into litellm_block_unblock_user_api | 2024-02-24 11:43:16 -08:00
Krish Dholakia | 5461e2fee8 | Merge pull request #2175 from BerriAI/litellm_stricter_function_calling_streaming_tests (fix(utils.py): stricter azure function calling tests) | 2024-02-23 22:58:07 -08:00
Krish Dholakia | 7657a5bd1b | Merge pull request #2165 from BerriAI/litellm_langfuse_ttfs_logging (fix(langfuse.py): support time to first token logging on langfuse) | 2024-02-23 21:50:03 -08:00
Krrish Dholakia | 74b97b23e9 | fix(utils.py): stricter azure function calling tests | 2024-02-23 20:55:32 -08:00
Krrish Dholakia | 483099fb5c | fix(utils.py): add more logging statements | 2024-02-23 15:17:22 -08:00
ishaan-jaff | 24fb50ff29 | (feat) add groq ai | 2024-02-23 10:42:51 -08:00
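The most recent commits in this window add Claude 3 support to LiteLLM: the anthropic messages API on Bedrock, streaming, vision tests, and tool calling in anthropic.py. As a rough illustration of what those changes enable, here is a minimal sketch that calls a Claude 3 model through litellm.completion() with an OpenAI-style tool definition; the model id, API key value, and get_current_weather tool are illustrative assumptions, not taken from these commits.

```python
# Minimal sketch, assuming litellm's OpenAI-compatible completion() API.
# The model id, API key, and tool schema below are illustrative placeholders.
import os
import litellm

os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."  # placeholder credential

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",  # hypothetical tool for illustration
            "description": "Get the current weather in a given city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = litellm.completion(
    model="claude-3-opus-20240229",  # assumed Claude 3 model id
    messages=[{"role": "user", "content": "What's the weather in Boston?"}],
    tools=tools,
)
print(response.choices[0].message)
```

Per the "(feat) streaming claude-3" commit, passing stream=True to the same call should yield incremental chunks instead of a single response object.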