| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| ishaan-jaff | 2f5d56af9f | (feat) async callbacks with litellm.completion() | 2023-12-07 18:09:57 -08:00 |
| ishaan-jaff | f744445db4 | (fix) make print_verbose non blocking | 2023-12-07 17:31:32 -08:00 |
| ishaan-jaff | d2a53f05ed | (fix) logging - better prints for async logger | 2023-12-07 17:31:32 -08:00 |
| Krrish Dholakia | 55ca691767 | fix(utils.py): fix get_llm_provider to handle the ':' in anthropic/bedrock calls | 2023-12-07 14:19:11 -08:00 |
| Krrish Dholakia | 69c34493ce | fix(router.py): fix default caching response value | 2023-12-07 13:44:31 -08:00 |
| ishaan-jaff | 5769f684ca | (fix) vertex ai - streaming chunks | 2023-12-07 09:38:37 -08:00 |
| ishaan-jaff | b0ad2affb2 | (feat) aembedding callback | 2023-12-06 19:09:06 -08:00 |
| Krrish Dholakia | 2af2a17bc8 | test: fix proxy server testing | 2023-12-06 18:38:53 -08:00 |
| ishaan-jaff | eef3f38b49 | (feat) add async loggers under custom logger | 2023-12-06 17:16:24 -08:00 |
| ishaan-jaff | bac8125e5c | (feat) litellm - add _async_failure_callback | 2023-12-06 14:43:47 -08:00 |
| Krrish Dholakia | c01b15af17 | docs(input.md): add hf_model_name to docs | 2023-12-05 16:56:18 -08:00 |
| Krrish Dholakia | 7e42c64cc5 | fix(utils.py): support sagemaker llama2 custom endpoints | 2023-12-05 16:05:15 -08:00 |
| ishaan-jaff | 7474cef0b7 | (fix) sagemaker Llama-2 70b | 2023-12-05 15:32:17 -08:00 |
| Krrish Dholakia | 3455f33230 | fix(utils.py): map cohere finish reasons | 2023-12-05 12:38:18 -08:00 |
| Krrish Dholakia | 733b1d87a7 | fix(utils.py): set text if empty string | 2023-12-05 12:26:44 -08:00 |
| Krrish Dholakia | 94abb14b99 | fix(_redis.py): support additional params for redis | 2023-12-05 12:16:51 -08:00 |
| ishaan-jaff | cf033be697 | (fix) bug in completion: _check_valid_arg | 2023-12-05 10:00:54 -08:00 |
| ishaan-jaff | 2038c9816f | (fix) patch max_retries for non openai llms | 2023-12-05 09:36:38 -08:00 |
| Krrish Dholakia | d1a525b6c9 | feat(utils.py): add async success callbacks for custom functions | 2023-12-04 16:42:40 -08:00 |
| ishaan-jaff | d518447f80 | (fix) streaming init response_obj as {} | 2023-12-04 15:19:47 -08:00 |
| ishaan-jaff | afd07e7f7c | (fix) palm: streaming | 2023-12-04 15:06:52 -08:00 |
| Krrish Dholakia | b76e3ff626 | fix(utils.py): fix azure streaming bug | 2023-12-04 12:38:22 -08:00 |
| Krrish Dholakia | 813bb15a00 | fix(proxy_server.py): fix /key/generate post endpoint | 2023-12-04 10:44:13 -08:00 |
| ishaan-jaff | fe985932be | (test) test completion: if 'user' passed to API | 2023-12-04 09:50:36 -08:00 |
| Krrish Dholakia | 5839b2dbec | fix(proxy_server.py): support model info augmenting for azure models | 2023-12-02 21:33:54 -08:00 |
| Krrish Dholakia | 00a9150f54 | fix(main.py): set user to none if not passed in | 2023-12-02 20:08:25 -08:00 |
| Krrish Dholakia | c85da2dfaa | fix(proxy_server.py): update db with master key if set, and fix tracking cost for azure models | 2023-12-02 15:58:08 -08:00 |
| Krrish Dholakia | 284fb64f4d | feat: support for azure key vault | 2023-12-01 19:36:06 -08:00 |
| Krrish Dholakia | 1c3aa0e723 | fix(utils.py): expand openai_token_counter selection | 2023-11-30 18:51:51 -08:00 |
| Krrish Dholakia | a4308fadce | fix(router.py): back-off if no models available | 2023-11-30 18:42:29 -08:00 |
| Krrish Dholakia | 36a3f06320 | (fix) support counting tokens for tool calls | 2023-11-30 18:24:21 -08:00 |
| Krrish Dholakia | af56d8a759 | fix(utils.py): fix azure completion cost calculation | 2023-11-30 09:19:35 -08:00 |
| Krrish Dholakia | 68e0eca6b8 | fix(utils.py): include system fingerprint in streaming response object | 2023-11-30 08:45:52 -08:00 |
| Krrish Dholakia | 4d232cc19e | fix(utils.py): fix register model cost map | 2023-11-29 21:12:29 -08:00 |
| Krrish Dholakia | 0341b0cd07 | feat(main.py): allow updating model cost via completion() | 2023-11-29 20:14:39 -08:00 |
| ishaan-jaff | ca03f83597 | (feat) async embeddings: OpenAI | 2023-11-29 19:35:08 -08:00 |
| ishaan-jaff | b4173db700 | (chore) util: remove_model_id | 2023-11-29 17:30:33 -08:00 |
| Krrish Dholakia | 83719a67b2 | fix(utils.py): raise stop iteration exception on bedrock stream close | 2023-11-29 16:43:11 -08:00 |
| Krrish Dholakia | 2a5592abe7 | fix(bedrock.py): support ai21 / bedrock streaming | 2023-11-29 16:35:06 -08:00 |
| ishaan-jaff | 8527d98a8f | (fix) OpenAI embedding | 2023-11-29 16:09:31 -08:00 |
| Krrish Dholakia | ddb2e954ac | fix(utils.py): stop sequence filtering for amazon titan models | 2023-11-29 16:04:14 -08:00 |
| Krrish Dholakia | 18afb51d72 | bump: version 1.7.13 → 1.7.14 | 2023-11-29 15:19:18 -08:00 |
| Krrish Dholakia | 94de89c812 | fix(utils.py): return last streaming chunk | 2023-11-29 12:11:08 -08:00 |
| Krrish Dholakia | 622eaf4a21 | fix(utils.py): fix parallel tool calling when streaming | 2023-11-29 10:56:21 -08:00 |
| Krrish Dholakia | 923c681fb8 | fix(utils.py): bedrock/cohere optional params | 2023-11-29 08:08:48 -08:00 |
| Krrish Dholakia | 825c3f5d23 | fix(utils.py): fix bedrock/cohere supported params | 2023-11-28 17:42:50 -08:00 |
| Krrish Dholakia | 60d6b6bc37 | fix(router.py): fix exponential backoff to use retry-after if present in headers | 2023-11-28 17:25:03 -08:00 |
| Krrish Dholakia | 3c0decee47 | fix(utils.py): bug fix return only non-null responses | 2023-11-28 09:43:42 -08:00 |
| Krrish Dholakia | 5450fb26f4 | fix(utils.py): azure tool calling streaming | 2023-11-27 19:07:38 -08:00 |
| Krrish Dholakia | 01a71cd0dc | fix(stream_chunk_builder): adding support for tool calling in completion counting | 2023-11-27 18:39:47 -08:00 |