Author | Commit | Message | Date
Krrish Dholakia | cffd190887 | test(test_custom_callback_router.py): fix test | 2023-12-13 19:06:02 -08:00
Krrish Dholakia | 73ecc012a9 | docs(embedding.md): add embedding docs to proxy | 2023-12-13 18:58:46 -08:00
Krrish Dholakia | 8d688b6217 | fix(utils.py): support caching for embedding + log cache hits | 2023-12-13 18:37:30 -08:00
Krrish Dholakia | 0f29cda8d9 | test(test_amazing_vertex_completion.py): fix testing | 2023-12-13 16:41:26 -08:00
Krrish Dholakia | e678009695 | fix(vertex_ai.py): add exception mapping for acompletion calls | 2023-12-13 16:35:50 -08:00
Krrish Dholakia | effdddc1c8 | fix(custom_logger.py): enable pre_call hooks to modify incoming data to proxy | 2023-12-13 16:20:37 -08:00
Krrish Dholakia | 88d09fc5a7 | fix(vertex.md): adding gemini-pro support to docs | 2023-12-13 14:38:55 -08:00
Krrish Dholakia | 7b8851cce5 | fix(ollama.py): fix async completion calls for ollama | 2023-12-13 13:10:25 -08:00
Mariusz Woloszyn | 1feb6317f6 | Fix #1119, no content when streaming. | 2023-12-13 21:42:35 +01:00
Krrish Dholakia | 75bcb37cb2 | fix(factory.py): fix tgai rendering template | 2023-12-13 12:27:31 -08:00
Krrish Dholakia | 69c29f8f86 | fix(vertex_ai.py): add support for real async streaming + completion calls | 2023-12-13 11:53:55 -08:00
Krrish Dholakia | 07015843ac | fix(vertex_ai.py): support optional params + enable async calls for gemini | 2023-12-13 11:01:23 -08:00
Krrish Dholakia | ef7a6e3ae1 | feat(vertex_ai.py): adds support for gemini-pro on vertex ai | 2023-12-13 10:26:30 -08:00
ishaan-jaff | 86e626edab | (feat) pass vertex_ai/ as custom_llm_provider | 2023-12-13 19:02:24 +03:00
Krrish Dholakia | d1aef59fbc | fix(utils.py): fix stream chunk builder for sync/async success | 2023-12-13 07:52:51 -08:00
Krrish Dholakia | a64bd2ca1e | fix(sagemaker.py): filter out templated prompt if in model response | 2023-12-13 07:43:33 -08:00
zeeland | 79ea466cf5 | refactor: add CustomStreamWrapper return type for completion | 2023-12-13 22:57:19 +08:00
Krrish Dholakia | f9dfeb502a | fix(langfuse.py): serialize message for logging | 2023-12-12 21:41:05 -08:00
Krrish Dholakia | 82d28a8825 | fix(factory.py): safely fail prompt template get requests for together ai | 2023-12-12 17:28:22 -08:00
Krrish Dholakia | 693292a64c | feat(proxy_server.py): add new /key/update endpoint | 2023-12-12 17:18:51 -08:00
Krrish Dholakia | 8e7116635f | fix(ollama.py): add support for async streaming | 2023-12-12 16:44:20 -08:00
Krrish Dholakia | dfdb17ae36 | test: refactor testing | 2023-12-12 16:21:41 -08:00
Krrish Dholakia | 669862643b | fix(utils.py): flush holding chunk for streaming, on stream end | 2023-12-12 16:13:31 -08:00
Krrish Dholakia | a266ad0319 | refactor(proxy_server.py): code cleanup | 2023-12-12 15:49:41 -08:00
Krrish Dholakia | 8b07a6c046 | fix(main.py): pass user_id + encoding_format for logging + to openai/azure | 2023-12-12 15:46:44 -08:00
Krrish Dholakia | 35fa176c97 | fix(utils.py): fix logging | 2023-12-12 15:46:36 -08:00
Krrish Dholakia | 1e970841a4 | fix(utils.py): more logging | 2023-12-12 15:46:27 -08:00
Krrish Dholakia | 632d6e0bff | fix(utils.py): add more logging | 2023-12-12 15:46:12 -08:00
Krrish Dholakia | 6e87d1ca18 | fix(utils.py): safe fail complete streaming response | 2023-12-12 15:46:12 -08:00
Krrish Dholakia | d6669fe9e3 | fix(utils.py): add more logging | 2023-12-12 15:46:00 -08:00
Krrish Dholakia | bdf29ca71f | fix(sagemaker.py): debug streaming | 2023-12-12 15:45:07 -08:00
ishaan-jaff | a251a52717 | (chore) remove junk tkinter import | 2023-12-12 13:54:50 -08:00
ishaan-jaff | 6d76878382 | (fix) pydantic: Field "model_list" has conflict with protected namespace "model_". | 2023-12-12 12:38:11 -08:00
ishaan-jaff | b6b88370ca | (fix) from re import T - junk import | 2023-12-12 12:26:15 -08:00
ishaan-jaff | eefb2bbf09 | (ci/cd) run again | 2023-12-12 12:22:14 -08:00
ishaan-jaff | 99b48eff17 | (fix) tkinter import | 2023-12-12 12:18:25 -08:00
Krrish Dholakia | 902b68dcbd | bump: version 1.12.5.dev1 → 1.12.5 | 2023-12-12 11:11:09 -08:00
Krrish Dholakia | edbf97adf2 | test: testing fixes | 2023-12-12 10:57:51 -08:00
Krrish Dholakia | 9cf5ab468f | fix(router.py): deepcopy initial model list, don't mutate it | 2023-12-12 09:54:06 -08:00
ishaan-jaff | a5dd8b1d4a | (fix) use deepcopy for model list | 2023-12-12 09:53:52 -08:00
ishaan-jaff | f5d64a4992 | (fix) test router | 2023-12-12 09:50:44 -08:00
Krrish Dholakia | 29c579e9ca | test: reinitialize litellm before each test | 2023-12-12 07:49:06 -08:00
Max Deichmann | c8585a8983 | fix langfuse tests | 2023-12-12 12:00:41 +01:00
Krrish Dholakia | dc148c37b0 | refactor(custom_logger.py): add async log stream event function | 2023-12-12 00:16:48 -08:00
Krrish Dholakia | 8bb01422ee | test(test_streaming.py): stricter output format testing | 2023-12-12 00:16:48 -08:00
ishaan-jaff | 0dd1111cea | (fix) assert streaming response = streamchoices() | 2023-12-12 00:12:57 -08:00
ishaan-jaff | 01d5875426 | (test) reset callbacks in completion() | 2023-12-11 23:35:52 -08:00
Krrish Dholakia | 1d42967725 | test(test_streaming.py): add testing for azure output chunk | 2023-12-11 23:32:38 -08:00
ishaan-jaff | c89ed8f4c8 | (ci/cd) test | 2023-12-11 23:22:40 -08:00
Krrish Dholakia | 2c1c75fdf0 | fix(ollama.py): enable parallel ollama completion calls | 2023-12-11 23:18:37 -08:00