Commit graph

734 commits

Author SHA1 Message Date
Krrish Dholakia
73e5b96d8e fix(utils.py): support cache logging for async router calls 2023-12-13 19:11:43 -08:00
Krrish Dholakia
8d688b6217 fix(utils.py): support caching for embedding + log cache hits 2023-12-13 18:37:30 -08:00
Krrish Dholakia
e678009695 fix(vertex_ai.py): add exception mapping for acompletion calls 2023-12-13 16:35:50 -08:00
Krrish Dholakia
7b8851cce5 fix(ollama.py): fix async completion calls for ollama 2023-12-13 13:10:25 -08:00
Krrish Dholakia
69c29f8f86 fix(vertex_ai.py): add support for real async streaming + completion calls 2023-12-13 11:53:55 -08:00
Krrish Dholakia
ef7a6e3ae1 feat(vertex_ai.py): adds support for gemini-pro on vertex ai 2023-12-13 10:26:30 -08:00
Krrish Dholakia
d1aef59fbc fix(utils.py): fix stream chunk builder for sync/async success 2023-12-13 07:52:51 -08:00
Krrish Dholakia
f9dfeb502a fix(langfuse.py): serialize message for logging 2023-12-12 21:41:05 -08:00
Krrish Dholakia
669862643b fix(utils.py): flush holding chunk for streaming, on stream end 2023-12-12 16:13:31 -08:00
Krrish Dholakia
8b07a6c046 fix(main.py): pass user_id + encoding_format for logging + to openai/azure 2023-12-12 15:46:44 -08:00
Krrish Dholakia
35fa176c97 fix(utils.py): fix logging 2023-12-12 15:46:36 -08:00
Krrish Dholakia
1e970841a4 fix(utils.py): more logging 2023-12-12 15:46:27 -08:00
Krrish Dholakia
632d6e0bff fix(utils.py): add more logging 2023-12-12 15:46:12 -08:00
Krrish Dholakia
6e87d1ca18 fix(utils.py): safe fail complete streaming response 2023-12-12 15:46:12 -08:00
Krrish Dholakia
d6669fe9e3 fix(utils.py): add more logging 2023-12-12 15:46:00 -08:00
Krrish Dholakia
bdf29ca71f fix(sagemaker.py): debug streaming 2023-12-12 15:45:07 -08:00
ishaan-jaff
0dd1111cea (fix) assert streaming response = streamchoices() 2023-12-12 00:12:57 -08:00
Krrish Dholakia
3e908bf507 fix(router.py): reset caching correctly 2023-12-11 19:57:34 -08:00
Krrish Dholakia
ad39afc0ad test(test_custom_callback_input.py): embedding callback tests for azure, openai, bedrock 2023-12-11 15:32:46 -08:00
Krrish Dholakia
b09ecb986e test(test_custom_callback_input.py): add bedrock testing 2023-12-11 13:00:01 -08:00
Krrish Dholakia
ea89a8a938 test(test_custom_callback_unit.py): adding unit tests for custom callbacks + fixing related bugs 2023-12-11 11:44:09 -08:00
ishaan-jaff
8cc23b72ec (feat) caching - bedrock 2023-12-11 08:43:50 -08:00
ishaan-jaff
ee53139ce2 (fix) langfuse test 2023-12-09 22:47:52 -08:00
ishaan-jaff
3c8603f148 (test) langfuse 2023-12-09 22:44:02 -08:00
ishaan-jaff
d18d5a3133 (feat) async completion caching 2023-12-09 14:22:10 -08:00
ishaan-jaff
5e5ffc3d8a (fix) use cache + custom logger 2023-12-09 12:41:26 -08:00
ishaan-jaff
0294e1119e (fix) success_handler / logging 2023-12-09 11:38:42 -08:00
Krrish Dholakia
9433c3c11b test: trigger ci/cd build 2023-12-09 11:03:19 -08:00
ishaan-jaff
4643a9ac18 (test) custom logger + stream + sync compl() 2023-12-09 10:22:45 -08:00
ishaan-jaff
c8b699c0aa (feat) custom logger: async stream,assemble chunks 2023-12-09 10:10:48 -08:00
Krrish Dholakia
4bf875d3ed fix(router.py): fix least-busy routing 2023-12-08 20:29:49 -08:00
ishaan-jaff
88c1d6649f (fix) async callback + stream-stop double-count chunk 2023-12-08 17:25:05 -08:00
ishaan-jaff
4e8e3ff33a (fix) async custom logger - trigger when stream completed 2023-12-08 17:25:05 -08:00
ishaan-jaff
72cca2e5a7 (feat) pass model_info, proxy_server_request callback 2023-12-08 14:26:18 -08:00
ishaan-jaff
6e8ad10991 (feat) caching - streaming caching support 2023-12-08 11:50:37 -08:00
Krrish Dholakia
7aec95ed7c feat(proxy_server.py): add sentry logging for db read/writes 2023-12-08 11:40:19 -08:00
Krrish Dholakia
1b35736797 fix(utils.py): fix cost calculation to handle tool input 2023-12-08 09:53:46 -08:00
ishaan-jaff
f99e3a3818 (fix) undo commit fd04b48 2023-12-07 18:37:06 -08:00
ishaan-jaff
fd04b48764 (feat) async callbacks with litellm.completion() 2023-12-07 18:09:57 -08:00
ishaan-jaff
762f28e4d7 (fix) make print_verbose non blocking 2023-12-07 17:31:32 -08:00
ishaan-jaff
2da50087b0 (fix) logging - better prints for async logger 2023-12-07 17:31:32 -08:00
Krrish Dholakia
3846ec6124 fix(utils.py): fix get_llm_provider to handle the ':' in anthropic/bedrock calls 2023-12-07 14:19:11 -08:00
Krrish Dholakia
e5638e2c5d fix(router.py): fix default caching response value 2023-12-07 13:44:31 -08:00
ishaan-jaff
e9d93c624c (fix) vertex ai - streaming chunks 2023-12-07 09:38:37 -08:00
ishaan-jaff
7fcd17cbbe (feat) aembedding callback 2023-12-06 19:09:06 -08:00
Krrish Dholakia
c0eedf28fc test: fix proxy server testing 2023-12-06 18:38:53 -08:00
ishaan-jaff
8adbf35623 (feat) add async loggers under custom logger 2023-12-06 17:16:24 -08:00
ishaan-jaff
b3f039627e (feat) litellm - add _async_failure_callback 2023-12-06 14:43:47 -08:00
Krrish Dholakia
ff949490de docs(input.md): add hf_model_name to docs 2023-12-05 16:56:18 -08:00
Krrish Dholakia
b4c78c7b9e fix(utils.py): support sagemaker llama2 custom endpoints 2023-12-05 16:05:15 -08:00