Krrish Dholakia | c673a23769 | fix(vertex_ai.py): add exception mapping for acompletion calls | 2023-12-13 16:35:50 -08:00
Krrish Dholakia | 2231601d5a | fix(ollama.py): fix async completion calls for ollama | 2023-12-13 13:10:25 -08:00
Krrish Dholakia | 13a7f2dd58 | fix(vertex_ai.py): add support for real async streaming + completion calls | 2023-12-13 11:53:55 -08:00
Krrish Dholakia | 43b160d70d | feat(vertex_ai.py): adds support for gemini-pro on vertex ai | 2023-12-13 10:26:30 -08:00
Krrish Dholakia | 56729cc1be | fix(utils.py): fix stream chunk builder for sync/async success | 2023-12-13 07:52:51 -08:00
Krrish Dholakia | 1a2eab103a | fix(langfuse.py): serialize message for logging | 2023-12-12 21:41:05 -08:00
Krrish Dholakia | d0e01d7e7a | fix(utils.py): flush holding chunk for streaming, on stream end | 2023-12-12 16:13:31 -08:00
Krrish Dholakia | e396fcb55c | fix(main.py): pass user_id + encoding_format for logging + to openai/azure | 2023-12-12 15:46:44 -08:00
Krrish Dholakia | 6be1e0cf6f | fix(utils.py): fix logging | 2023-12-12 15:46:36 -08:00
Krrish Dholakia | 9bbf1258f8 | fix(utils.py): more logging | 2023-12-12 15:46:27 -08:00
Krrish Dholakia | 93c7393ae8 | fix(utils.py): add more logging | 2023-12-12 15:46:12 -08:00
Krrish Dholakia | 0f17a65b01 | fix(utils.py): safe fail complete streaming response | 2023-12-12 15:46:12 -08:00
Krrish Dholakia | 0a3320ed6b | fix(utils.py): add more logging | 2023-12-12 15:46:00 -08:00
Krrish Dholakia | d059d1b101 | fix(sagemaker.py): debug streaming | 2023-12-12 15:45:07 -08:00
ishaan-jaff | 4dd5922d80 | (fix) assert streaming response = streamchoices() | 2023-12-12 00:12:57 -08:00
Krrish Dholakia | bbf094dcf5 | fix(router.py): reset caching correctly | 2023-12-11 19:57:34 -08:00
Krrish Dholakia | 02cfefa257 | test(test_custom_callback_input.py): embedding callback tests for azure, openai, bedrock | 2023-12-11 15:32:46 -08:00
Krrish Dholakia | 72591a2165 | test(test_custom_callback_input.py): add bedrock testing | 2023-12-11 13:00:01 -08:00
Krrish Dholakia | 47d0884c0c | test(test_custom_callback_unit.py): adding unit tests for custom callbacks + fixing related bugs | 2023-12-11 11:44:09 -08:00
ishaan-jaff | e775738664 | (feat) caching - bedrock | 2023-12-11 08:43:50 -08:00
ishaan-jaff | d5b0635e6e | (fix) langfuse test | 2023-12-09 22:47:52 -08:00
ishaan-jaff | 53c18f101c | (test) langfuse | 2023-12-09 22:44:02 -08:00
ishaan-jaff | 5e65e2effa | (feat) async completion caching | 2023-12-09 14:22:10 -08:00
ishaan-jaff | 53d38cccd8 | (fix) use cache + custom logger | 2023-12-09 12:41:26 -08:00
ishaan-jaff | 29fc4b7157 | (fix) success_handler / logging | 2023-12-09 11:38:42 -08:00
Krrish Dholakia | d4b4030f96 | test: trigger ci/cd build | 2023-12-09 11:03:19 -08:00
ishaan-jaff | 14e46da274 | (test) custom logger + stream + sync compl() | 2023-12-09 10:22:45 -08:00
ishaan-jaff | e056696831 | (feat) custom logger: async stream, assemble chunks | 2023-12-09 10:10:48 -08:00
Krrish Dholakia | a65c8919fc | fix(router.py): fix least-busy routing | 2023-12-08 20:29:49 -08:00
ishaan-jaff | fa1eade5f9 | (fix) async callback + stream-stop double count chunk | 2023-12-08 17:25:05 -08:00
ishaan-jaff | 09e8ad24f6 | (fix) async custom logger - trigger when stream completed | 2023-12-08 17:25:05 -08:00
ishaan-jaff | 22bb61bca2 | (feat) pass model_info, proxy_server_request callback | 2023-12-08 14:26:18 -08:00
ishaan-jaff | e430255794 | (feat) caching - streaming caching support | 2023-12-08 11:50:37 -08:00
Krrish Dholakia | 4ff969bf6d | feat(proxy_server.py): add sentry logging for db read/writes | 2023-12-08 11:40:19 -08:00
Krrish Dholakia | 7c962637f5 | fix(utils.py): fix cost calculation to handle tool input | 2023-12-08 09:53:46 -08:00
ishaan-jaff | 5071e1ef9f | (fix) undo commit fd04b48 | 2023-12-07 18:37:06 -08:00
ishaan-jaff | 2f5d56af9f | (feat) async callbacks with litellm.completion() | 2023-12-07 18:09:57 -08:00
ishaan-jaff | f744445db4 | (fix) make print_verbose non blocking | 2023-12-07 17:31:32 -08:00
ishaan-jaff | d2a53f05ed | (fix) logging - better prints for async logger | 2023-12-07 17:31:32 -08:00
Krrish Dholakia | 55ca691767 | fix(utils.py): fix get_llm_provider to handle the ':' in anthropic/bedrock calls | 2023-12-07 14:19:11 -08:00
Krrish Dholakia | 69c34493ce | fix(router.py): fix default caching response value | 2023-12-07 13:44:31 -08:00
ishaan-jaff | 5769f684ca | (fix) vertex ai - streaming chunks | 2023-12-07 09:38:37 -08:00
ishaan-jaff | b0ad2affb2 | (feat) aembedding callback | 2023-12-06 19:09:06 -08:00
Krrish Dholakia | 2af2a17bc8 | test: fix proxy server testing | 2023-12-06 18:38:53 -08:00
ishaan-jaff | eef3f38b49 | (feat) add async loggers under custom logger | 2023-12-06 17:16:24 -08:00
ishaan-jaff | bac8125e5c | (feat) litellm - add _async_failure_callback | 2023-12-06 14:43:47 -08:00
Krrish Dholakia | c01b15af17 | docs(input.md): add hf_model_name to docs | 2023-12-05 16:56:18 -08:00
Krrish Dholakia | 7e42c64cc5 | fix(utils.py): support sagemaker llama2 custom endpoints | 2023-12-05 16:05:15 -08:00
ishaan-jaff | 7474cef0b7 | (fix) sagemaker Llama-2 70b | 2023-12-05 15:32:17 -08:00
Krrish Dholakia | 3455f33230 | fix(utils.py): map cohere finish reasons | 2023-12-05 12:38:18 -08:00