Author | Commit | Message | Date
Krrish Dholakia | 6110d32b1c | feat(proxy/utils.py): return api base for request hanging alerts | 2024-04-06 15:58:53 -07:00
Krrish Dholakia | 7b30e5ae38 | fix(utils.py): fix content check in pre-call rules | 2024-04-06 09:03:19 -07:00
Krrish Dholakia | 30f57e7aa5 | fix(utils.py): move info statement to debug | 2024-04-05 22:06:46 -07:00
Ishaan Jaff | faa0d38087 | Merge pull request #2868 from BerriAI/litellm_add_command_r_on_proxy: Add Azure Command-r-plus on litellm proxy | 2024-04-05 15:13:47 -07:00
Ishaan Jaff | 2174b240d8 | Merge pull request #2861 from BerriAI/litellm_add_azure_command_r_plust: [FEAT] add azure command-r-plus | 2024-04-05 15:13:35 -07:00
Ishaan Jaff | 9055a071e6 | proxy - add azure/command r | 2024-04-05 14:35:31 -07:00
Krish Dholakia | a50edef1e6 | Merge pull request #2856 from lazyhope/anthropic-tools-use-2024-04-04: Support latest Anthropic Tools Use (2024-04-04) | 2024-04-05 14:31:26 -07:00
Ishaan Jaff | 6b9c04618e | fix use azure_ai/mistral | 2024-04-05 10:07:43 -07:00
Ishaan Jaff | 5ce80d82d3 | fix support azure/mistral models | 2024-04-05 09:32:39 -07:00
Krrish Dholakia | f0c4ff6e60 | fix(vertex_ai_anthropic.py): support streaming, async completion, async streaming for vertex ai anthropic | 2024-04-05 09:27:48 -07:00
Ishaan Jaff | 71352b1b36 | fix add azure/command-r-plus | 2024-04-05 08:53:24 -07:00
Zihao Li | d2cf9d2cf1 | Move tool definitions from system prompt to parameter and refactor tool calling parse | 2024-04-05 16:01:40 +08:00
Nandesh Guru | 0e9b1f5247 | Greenscale Integration: Adding logger for Greenscale | 2024-04-04 15:38:51 -07:00
Ishaan Jaff | 9dc4127576 | v0 return cache key in responses | 2024-04-04 10:11:18 -07:00
Krrish Dholakia | 15e0099948 | fix(proxy_server.py): return original model response via response headers - /v1/completions (to help devs with debugging) | 2024-04-03 13:05:43 -07:00
Krrish Dholakia | 919ec86b2b | fix(openai.py): switch to using openai sdk for text completion calls | 2024-04-02 15:08:12 -07:00
Krrish Dholakia | b07788d2a5 | fix(openai.py): return logprobs for text completion calls | 2024-04-02 14:05:56 -07:00
Krrish Dholakia | 0d949d71ab | fix(main.py): support text completion input being a list of strings (addresses https://github.com/BerriAI/litellm/issues/2792, https://github.com/BerriAI/litellm/issues/2777) | 2024-04-02 08:50:16 -07:00
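
The 0d949d71ab entry above adds support for passing a list of strings as the prompt in text completion calls. A minimal usage sketch, assuming the public litellm.text_completion entry point and placeholder model/prompt values (not taken from the commit itself):

```python
# Hypothetical sketch: list-of-strings prompt for litellm text completion.
# Assumes litellm is installed and OPENAI_API_KEY is set in the environment.
import litellm

response = litellm.text_completion(
    model="gpt-3.5-turbo-instruct",         # placeholder model name
    prompt=["Say hello.", "Say goodbye."],  # list-of-strings input described by the commit
)
print(response)
```
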
Sebastián Estévez | e50e76bbd5 | support cohere_chat in get_api_key | 2024-04-01 13:24:03 -04:00
Ishaan Jaff | c365de122a | check num retries in async wrapper | 2024-03-30 19:33:40 -07:00
Ishaan Jaff | bd95626579 | (fix) improve async perf | 2024-03-30 19:07:04 -07:00
Vincelwt | 1b84dfac91 | Merge branch 'main' into main | 2024-03-30 13:21:53 +09:00
Ishaan Jaff | 24570bc075 | (docs) grafana / prometheus | 2024-03-29 14:25:45 -07:00
Krrish Dholakia | 41fb76740e | fix(utils.py): exception mapping on 'next()' streaming error | 2024-03-29 09:18:41 -07:00
Krrish Dholakia | cd53291b62 | fix(utils.py): support bedrock mistral streaming | 2024-03-29 07:56:10 -07:00
Krrish Dholakia | 5d428ac94c | fix(utils.py): don't add chatml tokens to a simple text token count | 2024-03-28 13:48:48 -07:00
Krrish Dholakia | 2926d5a8eb | fix(proxy/utils.py): check cache before alerting user | 2024-03-27 20:09:15 -07:00
Krrish Dholakia | 9b7383ac67 | fix(utils.py): don't run post-call rules on a coroutine function | 2024-03-27 13:16:27 -07:00
Krish Dholakia | 0ab708e6f1 | Merge pull request #2704 from BerriAI/litellm_jwt_auth_improvements_3: fix(handle_jwt.py): enable team-based jwt-auth access | 2024-03-26 16:06:56 -07:00
Krrish Dholakia | 4281f1545b | fix(utils.py): check if item in list is pydantic object or dict before dereferencing | 2024-03-26 14:39:16 -07:00
Krrish Dholakia | 4d7f4550e2 | test(test_batch_completions.py): handle anthropic overloaded error | 2024-03-26 13:55:03 -07:00
Krrish Dholakia | 3a82ff2ef2 | fix(utils.py): don't send subsequent chunks if last chunk sent (prevents multiple empty finish chunks from being sent) | 2024-03-26 13:49:42 -07:00
Ishaan Jaff | da503eab18 | Merge branch 'main' into litellm_remove_litellm_telemetry | 2024-03-26 11:35:02 -07:00
Ishaan Jaff | 6b4b05b58f | (fix) remove litellm.telemetry | 2024-03-26 11:21:09 -07:00
Krrish Dholakia | 584d187e0e | fix(utils.py): check if message is pydantic object or dict before dereferencing | 2024-03-26 09:47:44 -07:00
Krrish Dholakia | 2dd2b8a8e3 | test(test_streaming.py): add unit testing for custom stream wrapper | 2024-03-26 08:57:44 -07:00
Krish Dholakia | 6f11f300fc | Merge pull request #2656 from TashaSkyUp/patch-1: fix for: when using ModelResponse.json() to save and then reconstruct a ModelResponse the choices field ends up empty | 2024-03-26 08:36:55 -07:00
Ishaan Jaff | 81b716d8da | (fix) cache control logic | 2024-03-26 07:36:45 -07:00
Ishaan Jaff | 965fb6eb2c | (fix) cache control logic | 2024-03-25 22:19:34 -07:00
Krrish Dholakia | 1c55f2ccc5 | fix(utils.py): persist system fingerprint across chunks | 2024-03-25 19:24:09 -07:00
Krrish Dholakia | bd75498913 | fix(utils.py): log success event for streaming | 2024-03-25 19:03:10 -07:00
Krrish Dholakia | 1ac641165b | fix(utils.py): persist response id across chunks | 2024-03-25 18:20:43 -07:00
Krrish Dholakia | dc2c4af631 | fix(utils.py): fix text completion streaming | 2024-03-25 16:47:17 -07:00
Krrish Dholakia | 9e1e97528d | fix(utils.py): ensure last chunk is always empty delta w/ finish reason (makes sure we're openai-compatible with our streaming; adds stricter tests for this as well) | 2024-03-25 16:33:41 -07:00
Krrish Dholakia | f153889738 | fix(utils.py): allow user to disable streaming logging (fixes event loop issue for litellm.disable_streaming_logging) | 2024-03-25 14:28:46 -07:00
Max Deichmann | efb43ccd02 | push | 2024-03-25 17:43:55 +01:00
Krrish Dholakia | eb3ca85d7e | feat(router.py): enable pre-call checks (filter models outside of context window limits of a given message for a model group; https://github.com/BerriAI/litellm/issues/872) | 2024-03-23 18:03:30 -07:00
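
The eb3ca85d7e entry above describes router-level pre-call checks that filter out deployments whose context window cannot fit a given message. A rough sketch of how such a check might be enabled, assuming the Router's enable_pre_call_checks flag and placeholder model entries (the flag name and deployments are assumptions, not taken verbatim from the commit):

```python
# Rough sketch: enabling pre-call checks on the litellm Router so that
# deployments with too-small context windows are filtered out for a model group.
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",  # model group name (placeholder)
            "litellm_params": {"model": "gpt-3.5-turbo"},
        },
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {"model": "gpt-3.5-turbo-16k"},  # larger context window
        },
    ],
    enable_pre_call_checks=True,  # assumed flag: skip deployments the message won't fit
)

response = router.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "A very long prompt..."}],
)
```
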
Tasha Upchurch | ab919004a2 | Update utils.py: fix for constructed from dict choices.message being a dict still instead of Message class | 2024-03-23 00:12:24 -04:00
Ishaan Jaff | f39f606e02 | (feat) remove litellm.telemetry | 2024-03-22 20:58:14 -07:00
Tasha Upchurch | 79201449d2 | Update utils.py: fix for creating an empty choices if no choices passed in | 2024-03-22 23:39:17 -04:00