| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| Krrish Dholakia | 1e970841a4 | fix(utils.py): more logging | 2023-12-12 15:46:27 -08:00 |
| Krrish Dholakia | 632d6e0bff | fix(utils.py): add more logging | 2023-12-12 15:46:12 -08:00 |
| Krrish Dholakia | 6e87d1ca18 | fix(utils.py): safe fail complete streaming response | 2023-12-12 15:46:12 -08:00 |
| Krrish Dholakia | d6669fe9e3 | fix(utils.py): add more logging | 2023-12-12 15:46:00 -08:00 |
| Krrish Dholakia | bdf29ca71f | fix(sagemaker.py): debug streaming | 2023-12-12 15:45:07 -08:00 |
| ishaan-jaff | 0dd1111cea | (fix) assert streaming response = streamchoices() | 2023-12-12 00:12:57 -08:00 |
| Krrish Dholakia | 3e908bf507 | fix(router.py): reset caching correctly | 2023-12-11 19:57:34 -08:00 |
| Krrish Dholakia | ad39afc0ad | test(test_custom_callback_input.py): embedding callback tests for azure, openai, bedrock | 2023-12-11 15:32:46 -08:00 |
| Krrish Dholakia | b09ecb986e | test(test_custom_callback_input.py): add bedrock testing | 2023-12-11 13:00:01 -08:00 |
| Krrish Dholakia | ea89a8a938 | test(test_custom_callback_unit.py): adding unit tests for custom callbacks + fixing related bugs | 2023-12-11 11:44:09 -08:00 |
| ishaan-jaff | 8cc23b72ec | (feat) caching - bedrock | 2023-12-11 08:43:50 -08:00 |
| ishaan-jaff | ee53139ce2 | (fix) langfuse test | 2023-12-09 22:47:52 -08:00 |
| ishaan-jaff | 3c8603f148 | (test) langfuse | 2023-12-09 22:44:02 -08:00 |
| ishaan-jaff | d18d5a3133 | (feat) async completion caching | 2023-12-09 14:22:10 -08:00 |
| ishaan-jaff | 5e5ffc3d8a | (fix) use cache + custom logger | 2023-12-09 12:41:26 -08:00 |
| ishaan-jaff | 0294e1119e | (fix) success_handler / logging | 2023-12-09 11:38:42 -08:00 |
| Krrish Dholakia | 9433c3c11b | test: trigger ci/cd build | 2023-12-09 11:03:19 -08:00 |
| ishaan-jaff | 4643a9ac18 | (test) custom logger + stream + sync compl() | 2023-12-09 10:22:45 -08:00 |
| ishaan-jaff | c8b699c0aa | (feat) custom logger: async stream, assemble chunks | 2023-12-09 10:10:48 -08:00 |
| Krrish Dholakia | 4bf875d3ed | fix(router.py): fix least-busy routing | 2023-12-08 20:29:49 -08:00 |
| ishaan-jaff | 88c1d6649f | (fix) asyc callback + stream-stop dbl cnt chunk | 2023-12-08 17:25:05 -08:00 |
| ishaan-jaff | 4e8e3ff33a | (fix) async custom logger - trigger when stream completed | 2023-12-08 17:25:05 -08:00 |
| ishaan-jaff | 72cca2e5a7 | (feat) pass model_info, proxy_server_request callback | 2023-12-08 14:26:18 -08:00 |
| ishaan-jaff | 6e8ad10991 | (feat) caching - streaming caching support | 2023-12-08 11:50:37 -08:00 |
| Krrish Dholakia | 7aec95ed7c | feat(proxy_server.py): add sentry logging for db read/writes | 2023-12-08 11:40:19 -08:00 |
| Krrish Dholakia | 1b35736797 | fix(utils.py): fix cost calculation to handle tool input | 2023-12-08 09:53:46 -08:00 |
| ishaan-jaff | f99e3a3818 | (fix) undo commit fd04b48 | 2023-12-07 18:37:06 -08:00 |
| ishaan-jaff | fd04b48764 | (feat) async callbacks with litellm.completion() | 2023-12-07 18:09:57 -08:00 |
| ishaan-jaff | 762f28e4d7 | (fix) make print_verbose non blocking | 2023-12-07 17:31:32 -08:00 |
| ishaan-jaff | 2da50087b0 | (fix) logging - better prints for async logger | 2023-12-07 17:31:32 -08:00 |
| Krrish Dholakia | 3846ec6124 | fix(utils.py): fix get_llm_provider to handle the ':' in anthropic/bedrock calls | 2023-12-07 14:19:11 -08:00 |
| Krrish Dholakia | e5638e2c5d | fix(router.py): fix default caching response value | 2023-12-07 13:44:31 -08:00 |
| ishaan-jaff | e9d93c624c | (fix) vertex ai - streaming chunks | 2023-12-07 09:38:37 -08:00 |
| ishaan-jaff | 7fcd17cbbe | (feat) aembedding callback | 2023-12-06 19:09:06 -08:00 |
| Krrish Dholakia | c0eedf28fc | test: fix proxy server testing | 2023-12-06 18:38:53 -08:00 |
| ishaan-jaff | 8adbf35623 | (feat) add async loggers under custom logger | 2023-12-06 17:16:24 -08:00 |
| ishaan-jaff | b3f039627e | (feat) litellm - add _async_failure_callback | 2023-12-06 14:43:47 -08:00 |
| Krrish Dholakia | ff949490de | docs(input.md): add hf_model_name to docs | 2023-12-05 16:56:18 -08:00 |
| Krrish Dholakia | b4c78c7b9e | fix(utils.py): support sagemaker llama2 custom endpoints | 2023-12-05 16:05:15 -08:00 |
| ishaan-jaff | c4bda13820 | (fix) sagemaker Llama-2 70b | 2023-12-05 15:32:17 -08:00 |
| Krrish Dholakia | 2a02fcbffb | fix(utils.py): map cohere finish reasons | 2023-12-05 12:38:18 -08:00 |
| Krrish Dholakia | ef7795add6 | fix(utils.py): set text if empty string | 2023-12-05 12:26:44 -08:00 |
| Krrish Dholakia | 88c95ca259 | fix(_redis.py): support additional params for redis | 2023-12-05 12:16:51 -08:00 |
| ishaan-jaff | a602d59645 | (fix) bug in completion: _check_valid_arg | 2023-12-05 10:00:54 -08:00 |
| ishaan-jaff | 732a049513 | (fix) patch max_retries for non openai llms | 2023-12-05 09:36:38 -08:00 |
| Krrish Dholakia | e0ccb281d8 | feat(utils.py): add async success callbacks for custom functions | 2023-12-04 16:42:40 -08:00 |
| ishaan-jaff | bc691cbbcd | (fix) streaming init response_obj as {} | 2023-12-04 15:19:47 -08:00 |
| ishaan-jaff | 3a4e512a75 | (fix) palm: streaming | 2023-12-04 15:06:52 -08:00 |
| Krrish Dholakia | 728b879c33 | fix(utils.py): fix azure streaming bug | 2023-12-04 12:38:22 -08:00 |
| Krrish Dholakia | 63e55f1865 | fix(proxy_server.py): fix /key/generate post endpoint | 2023-12-04 10:44:13 -08:00 |