| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| Krrish Dholakia | 0f62213656 | fix(utils.py): fix default message object values | 2024-03-04 21:19:03 -08:00 |
| Ishaan Jaff | f1c39f65d7 | Merge branch 'main' into litellm_maintain_Claude2_support | 2024-03-04 21:14:28 -08:00 |
| Krrish Dholakia | caa17d484a | fix(bedrock.py): working image calls to claude 3 | 2024-03-04 18:12:47 -08:00 |
| Krrish Dholakia | 478307d4cf | fix(bedrock.py): support anthropic messages api on bedrock (claude-3) | 2024-03-04 17:15:47 -08:00 |
| Krrish Dholakia | edda2d9293 | test(test_completion.py): add testing for anthropic vision calling | 2024-03-04 13:34:49 -08:00 |
| ishaan-jaff | 1183e5f2e5 | (feat) maintain anthropic text completion | 2024-03-04 11:16:34 -08:00 |
| Krrish Dholakia | ae82b3f31a | feat(anthropic.py): adds tool calling support | 2024-03-04 10:42:28 -08:00 |
| Krrish Dholakia | 873ddde924 | fix(huggingface_restapi.py): fix huggingface streaming error raising | 2024-03-04 09:32:41 -08:00 |
| Ishaan Jaff | 14fc8355fb | Merge pull request #2315 from BerriAI/litellm_add_claude_3 ([FEAT]- add claude 3) | 2024-03-04 09:23:13 -08:00 |
| Ishaan Jaff | 84415ef7b5 | Merge pull request #2290 from ti3x/bedrock_mistral (Add support for Bedrock Mistral models) | 2024-03-04 08:42:47 -08:00 |
| Krrish Dholakia | 019b9ef6f6 | fix(utils.py): fix num retries logic | 2024-03-04 08:01:02 -08:00 |
| ishaan-jaff | fdd8199a25 | (feat) streaming claude-3 | 2024-03-04 07:29:23 -08:00 |
| ishaan-jaff | 19eb9063fb | (feat) - add claude 3 | 2024-03-04 07:13:08 -08:00 |
| Vince Loewe | f2a0156e88 | Merge branch 'main' of github.com:lunary-ai/litellm | 2024-03-02 11:15:26 -08:00 |
| Vince Loewe | bff53a0698 | remove useless logging | 2024-03-02 11:15:21 -08:00 |
| Tim Xia | 12d7ea914a | update comments | 2024-03-02 13:34:39 -05:00 |
| Tim Xia | a4e24761a0 | map optional params | 2024-03-02 13:25:04 -05:00 |
| Krish Dholakia | ea1e0f5ad9 | Merge pull request #2292 from BerriAI/litellm_mistral_streaming_error (fix(utils.py): handle mistral streaming error) | 2024-03-02 07:48:14 -08:00 |
| Mikhail Khludnev | 381858139e | utils.validate_environment to handle OLLAMA_API_BASE env | 2024-03-02 13:41:59 +03:00 |
| Krrish Dholakia | 39037d1e22 | fix(utils.py): handle mistral streaming error | 2024-03-01 21:23:10 -08:00 |
| Vince Loewe | 05c0fa8b9b | Merge branch 'main' into main | 2024-03-01 13:37:17 -08:00 |
| Krish Dholakia | f9ef3ce32d | Merge pull request #2281 from mkhludnev/fix#2260-2261 (fix #2260 #2261) | 2024-03-01 13:23:01 -08:00 |
| Mikhail Khludnev | 8363e24e58 | fix #2260 #2261 | 2024-03-01 21:46:57 +03:00 |
| Krrish Dholakia | 6b8b0f40cf | feat(proxy_server.py): add new team_member delete endpoint | 2024-03-01 09:14:08 -08:00 |
| Vince Loewe | fa6211616e | fix streaming | 2024-02-29 13:23:51 -08:00 |
| ishaan-jaff | 3a661b209a | (chore) add mistral azure ai comments | 2024-02-29 12:04:16 -08:00 |
| ishaan-jaff | 38e003bcc0 | (test) fix mistral tests | 2024-02-29 12:01:01 -08:00 |
| ishaan-jaff | ec70cdc558 | (feat) use mistral azure with env vars | 2024-02-29 08:28:46 -08:00 |
| ishaan-jaff | 7c3141a66c | (feat) mistral allow setting API base in env | 2024-02-29 08:15:47 -08:00 |
| Vince Loewe | f98619e6f2 | Merge branch 'BerriAI:main' into main | 2024-02-28 22:18:14 -08:00 |
| ishaan-jaff | 2771686124 | (feat) helpers for supports_function_calling | 2024-02-28 18:15:05 -08:00 |
| ishaan-jaff | e5269fdb7c | (feat) support mistral function calling | 2024-02-28 18:15:05 -08:00 |
| Krrish Dholakia | 768ce68fc9 | fix(utils.py): fix palm exception mapping | 2024-02-28 18:15:05 -08:00 |
| ishaan-jaff | 61377b0c8d | (fix) async logging race condition | 2024-02-28 14:44:02 -08:00 |
| ishaan-jaff | a525fca847 | (feat) add mistral tool calling support | 2024-02-28 11:48:20 -08:00 |
| Vince Loewe | 4e74582b6f | fix timestamps and user | 2024-02-27 22:30:32 -08:00 |
| Vince Loewe | a9648613dc | feat: LLMonitor is now Lunary | 2024-02-27 22:07:13 -08:00 |
| Krish Dholakia | d34cd7ec9a | Merge branch 'main' into litellm_streaming_format_fix | 2024-02-27 20:16:09 -08:00 |
| Krrish Dholakia | 94f4f96994 | fix(utils.py): fix streaming issue | 2024-02-27 14:57:50 -08:00 |
| Krrish Dholakia | 9f7b322ae2 | fix(utils.py): map optional params for gemini pro vision | 2024-02-27 14:45:53 -08:00 |
| Krrish Dholakia | 57998c28dc | fix(proxy_server.py): drop none values in streaming response | 2024-02-27 14:37:29 -08:00 |
| Ishaan Jaff | 3ff5745333 | Merge pull request #2216 from BerriAI/litellm_fix_using_mistral_azure_ai ([FIX] using mistral on azure ai studio) | 2024-02-27 08:37:20 -08:00 |
| ishaan-jaff | cfebbdfa7b | (fix) support mistral on azure ai studio | 2024-02-27 06:48:09 -08:00 |
| zu1k | 2e75279639 | fix(utils.py): fix compatibility between together_ai and openai-python | 2024-02-27 16:38:45 +08:00 |
| Krish Dholakia | 365e7ed5b9 | Merge pull request #2208 from BerriAI/litellm_enforce_team_limits (Litellm enforce team limits) | 2024-02-26 23:10:01 -08:00 |
| Krrish Dholakia | 1447621128 | fix(utils.py): fix redis cache test | 2024-02-26 22:04:24 -08:00 |
| Krish Dholakia | 95b5b7f1fc | Merge pull request #2203 from BerriAI/litellm_streaming_caching_fix (fix(utils.py): support returning caching streaming response for function calling streaming calls) | 2024-02-26 19:58:00 -08:00 |
| Krrish Dholakia | 2a6a72a0e7 | fix(utils.py): fixing sync streaming for caching | 2024-02-26 19:32:30 -08:00 |
| Krrish Dholakia | 788e24bd83 | fix(utils.py): fix streaming logic | 2024-02-26 14:26:58 -08:00 |
| Krrish Dholakia | 5b06627c09 | fix(utils.py): fix streaming | 2024-02-26 12:52:53 -08:00 |
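Taken together, the commits above add Claude 3 support (including streaming and the Bedrock messages API), Anthropic and Mistral tool calling, and a `supports_function_calling` helper to litellm. The sketch below is purely illustrative of what that surface looks like from the caller's side; it assumes litellm's standard `completion()` entry point, the public `claude-3-opus-20240229` model name, and the `ANTHROPIC_API_KEY` environment variable, and is not code taken from these commits.

```python
# Illustrative sketch only: model names, env var, and helper usage are
# assumptions based on the commit messages above, not repository code.
import os

import litellm
from litellm import completion

os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."  # placeholder key

# Claude 3 via the Anthropic messages API ("(feat) - add claude 3")
response = completion(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "Summarize this changelog in one line."}],
)
print(response.choices[0].message.content)

# Streaming landed separately ("(feat) streaming claude-3")
for chunk in completion(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
):
    print(chunk)

# Capability helper from "(feat) helpers for supports_function_calling";
# the mistral model name here is an assumption for illustration.
print(litellm.supports_function_calling(model="mistral/mistral-large-latest"))
```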