Commit graph

3232 commits

Author SHA1 Message Date
ishaan-jaff
67518387f1 (test) ollama test 2023-12-14 19:48:18 +05:30
ishaan-jaff
26c6c1a03e (test) router - acompletion + cache 2023-12-14 19:46:56 +05:30
ishaan-jaff
ce6eb8a30e (test) add ollama to ci/cd 2023-12-14 19:42:44 +05:30
ishaan-jaff
5c40929a9b (test) add ollama tests to our ci/cd 2023-12-14 19:42:25 +05:30
ishaan-jaff
c0cc78b943 (feat) mistral - add exception mapping 2023-12-14 18:57:39 +05:30
ishaan-jaff
ee7e2869cb (test) mistral - safe_mode 2023-12-14 18:42:55 +05:30
ishaan-jaff
e34e4d3b71 (test) mistral api - safe_mode, random_seed 2023-12-14 18:42:43 +05:30
ishaan-jaff
b52ffe1bdf (feat) mistral - add random_seed, safe_mode params 2023-12-14 18:42:00 +05:30
ishaan-jaff
6cda49ac6a (test) mistral api streaming 2023-12-14 18:28:03 +05:30
ishaan-jaff
bfe1e3fc02 (feat) add mistral to proxy + router 2023-12-14 18:20:08 +05:30
ishaan-jaff
0ef7bf237e (test) mistral ai api 2023-12-14 18:19:45 +05:30
ishaan-jaff
7945664e61 (feat) add mistral api 2023-12-14 18:17:48 +05:30
ishaan-jaff
77b26fe9bb (test) test_custom_callback - aembedding cache 2023-12-14 17:36:29 +05:30
ishaan-jaff
9526ee16c4 (fix) aembedding - don't pop aembedding out 2023-12-14 17:13:35 +05:30
Ishaan Jaff
7d8ce6d62e Merge branch 'main' into fix-langfuse-tests 2023-12-14 16:55:47 +05:30
Ishaan Jaff
cbc8cb08c8 Merge pull request #1112 from Undertone0809/add-cs-return-type-for-completion (refactor: add CustomStreamWrapper return type for completion) 2023-12-14 16:52:11 +05:30
ishaan-jaff
d1cf41888b (feat) proxy add docstring for /test 2023-12-14 16:51:16 +05:30
ishaan-jaff
9a35f347ab (test) - caching - override when caching = False 2023-12-14 16:20:29 +05:30
ishaan-jaff
008df34ddc (feat) use async_cache for acompletion/aembedding 2023-12-14 16:04:45 +05:30
ishaan-jaff
a8e12661c2 (fix) caching - get_cache_key - dont use set 2023-12-14 14:09:24 +05:30
ishaan-jaff
ee6b936377 (test) caching - get cache key for embedding 2023-12-14 14:08:58 +05:30
ishaan-jaff
80fc8050eb (feat) proxy - model group alias 2023-12-14 13:24:10 +05:30
ishaan-jaff
77bcaaae9e (fix) proxy cli --version 2023-12-14 13:22:39 +05:30
ishaan-jaff
b70bfbb06f (test) router - test_model_group_aliases 2023-12-14 13:16:44 +05:30
ishaan-jaff
841e941ecd (test) router - model_group_alias 2023-12-14 13:08:35 +05:30
ishaan-jaff
b7a5ab5ffa (feat) proxy - use model_group_alias 2023-12-14 13:08:14 +05:30
ishaan-jaff
1e48f58443 (fix) custom_callback_input test 2023-12-14 12:40:07 +05:30
ishaan-jaff
241add8b33 (feat) proxy add --version 2023-12-14 12:28:42 +05:30
Krrish Dholakia
73e5b96d8e fix(utils.py): support cache logging for async router calls 2023-12-13 19:11:43 -08:00
Krrish Dholakia
cffd190887 test(test_custom_callback_router.py): fix test 2023-12-13 19:06:02 -08:00
Krrish Dholakia
73ecc012a9 docs(embedding.md): add embedding docs to proxy 2023-12-13 18:58:46 -08:00
Krrish Dholakia
8d688b6217 fix(utils.py): support caching for embedding + log cache hits 2023-12-13 18:37:30 -08:00
Krrish Dholakia
0f29cda8d9 test(test_amazing_vertex_completion.py): fix testing 2023-12-13 16:41:26 -08:00
Krrish Dholakia
e678009695 fix(vertex_ai.py): add exception mapping for acompletion calls 2023-12-13 16:35:50 -08:00
Krrish Dholakia
effdddc1c8 fix(custom_logger.py): enable pre_call hooks to modify incoming data to proxy 2023-12-13 16:20:37 -08:00
Krrish Dholakia
88d09fc5a7 fix(vertex.md): adding gemini-pro support to docs 2023-12-13 14:38:55 -08:00
Krrish Dholakia
7b8851cce5 fix(ollama.py): fix async completion calls for ollama 2023-12-13 13:10:25 -08:00
Mariusz Woloszyn
1feb6317f6 Fix #1119, no content when streaming. 2023-12-13 21:42:35 +01:00
Krrish Dholakia
75bcb37cb2 fix(factory.py): fix tgai rendering template 2023-12-13 12:27:31 -08:00
Krrish Dholakia
69c29f8f86 fix(vertex_ai.py): add support for real async streaming + completion calls 2023-12-13 11:53:55 -08:00
Krrish Dholakia
07015843ac fix(vertex_ai.py): support optional params + enable async calls for gemini 2023-12-13 11:01:23 -08:00
Krrish Dholakia
ef7a6e3ae1 feat(vertex_ai.py): adds support for gemini-pro on vertex ai 2023-12-13 10:26:30 -08:00
ishaan-jaff
86e626edab (feat) pass vertex_ai/ as custom_llm_provider 2023-12-13 19:02:24 +03:00
Krrish Dholakia
d1aef59fbc fix(utils.py): fix stream chunk builder for sync/async success 2023-12-13 07:52:51 -08:00
Krrish Dholakia
a64bd2ca1e fix(sagemaker.py): filter out templated prompt if in model response 2023-12-13 07:43:33 -08:00
zeeland
79ea466cf5 refactor: add CustomStreamWrapper return type for completion 2023-12-13 22:57:19 +08:00
Krrish Dholakia
f9dfeb502a fix(langfuse.py): serialize message for logging 2023-12-12 21:41:05 -08:00
Krrish Dholakia
82d28a8825 fix(factory.py): safely fail prompt template get requests for together ai 2023-12-12 17:28:22 -08:00
Krrish Dholakia
693292a64c feat(proxy_server.py): add new /key/update endpoint 2023-12-12 17:18:51 -08:00
Krrish Dholakia
8e7116635f fix(ollama.py): add support for async streaming 2023-12-12 16:44:20 -08:00