David Christian | fe7e3ff038 | added support for bedrock llama models | 2023-11-13 15:41:21 -08:00
Krrish Dholakia | d4de55b053 | fix(together_ai.py): exception mapping for tgai | 2023-11-13 13:17:15 -08:00
Krrish Dholakia | aa8ca781ba | test(test_completion.py): cleanup tests | 2023-11-13 11:23:38 -08:00
Krrish Dholakia | 4340749ea3 | fix(timeout.py): fix timeout issue | 2023-11-13 11:07:17 -08:00
ishaan-jaff | 1207910522 | (fix) proxy cli maintain back comp with openai < 1.00 | 2023-11-13 11:06:59 -08:00
ishaan-jaff | e125414611 | (fix) proxy cli compatible with openai v1.0.0 | 2023-11-13 10:58:20 -08:00
ishaan-jaff | 16abdf44c9 | (fix) proxy_server convert chunk to dict() | 2023-11-13 10:58:20 -08:00
Krrish Dholakia | 1665b872c3 | fix(caching.py): dump model response object as json | 2023-11-13 10:41:04 -08:00
ishaan-jaff | 4439109658 | (fix) text completion response | 2023-11-13 10:29:23 -08:00
ishaan-jaff | 18b694f01a | (fix) proxy cli use openai v1.0.0 | 2023-11-13 10:08:48 -08:00
ishaan-jaff | a21ff38694 | (test) deepinfra with openai v1.0.0 | 2023-11-13 09:51:45 -08:00
ishaan-jaff | 27cbd7d895 | (fix) deepinfra with openai v1.0.0 | 2023-11-13 09:51:22 -08:00
ishaan-jaff | cf0ab7155e | (fix) proxy + docs: use openai.chat.completions.create instead of openai.ChatCompletions | 2023-11-13 08:24:26 -08:00
ishaan-jaff | 7a3607e00c | (test) token_counter | 2023-11-13 08:02:46 -08:00
ishaan-jaff | c91abc8ad1 | (fix) token_counter - use openai token counter only for chat completion | 2023-11-13 08:00:27 -08:00
ishaan-jaff | e5ec4a92fe | (test) add test token counter | 2023-11-13 07:42:08 -08:00
Krrish Dholakia | 62013520aa | fix(utils.py): replacing openai.error import statements | 2023-11-11 19:25:21 -08:00
Krrish Dholakia | c5c3096a47 | build(main.py): trigger testing | 2023-11-11 19:20:48 -08:00
Krrish Dholakia | 45b6f8b853 | refactor: fixing linting issues | 2023-11-11 18:52:28 -08:00
Krrish Dholakia | ae35c13015 | refactor(ai21, aleph-alpha, ollama): making ai21, aleph-alpha, ollama compatible with openai v1 sdk | 2023-11-11 17:49:13 -08:00
Krrish Dholakia | c6ce3fedcd | fix(main.py): fix caching for router | 2023-11-11 17:45:23 -08:00
Krrish Dholakia | 4f42beb9d9 | refactor(huggingface, anthropic, replicate, sagemaker): making huggingface, anthropic, replicate, sagemaker compatible with openai v1 sdk | 2023-11-11 17:38:15 -08:00
Krrish Dholakia | 547598a134 | refactor(bedrock.py + cohere.py): making bedrock and cohere compatible with openai v1 sdk | 2023-11-11 17:33:19 -08:00
Krrish Dholakia | 39c2597c33 | refactor(azure.py): working azure completion calls with openai v1 sdk | 2023-11-11 16:44:39 -08:00
Krrish Dholakia | d0bd932b3c | refactor(openai.py): working openai chat + text completion for openai v1 sdk | 2023-11-11 16:25:10 -08:00
Krrish Dholakia | d3323ba637 | refactor(openai.py): making it compatible for openai v1 [BREAKING CHANGE] | 2023-11-11 15:33:02 -08:00
ishaan-jaff | 833c38edeb | (fix) proxy raise exception when config path does not exist | 2023-11-11 12:36:22 -08:00
ishaan-jaff | b74b051385 | (test) tokenizer worked test | 2023-11-11 12:13:57 -08:00
ishaan-jaff | 96bca9a836 | (test) try/pass APIError for TG AI | 2023-11-11 11:34:15 -08:00
ishaan-jaff | 59c76db34e | (fix) add APIError to litellm module | 2023-11-11 11:33:02 -08:00
ishaan-jaff | d8f735565c | (fix) tg ai raise errors on non 200 responses | 2023-11-11 11:21:12 -08:00
ishaan-jaff | 96f0a068e6 | (test) gpt4 vision | 2023-11-11 10:34:37 -08:00
ishaan-jaff | 4d67cee135 | (fix) completion gpt-4 vision check finish_details or finish_reason | 2023-11-11 10:28:20 -08:00
Ishaan Jaff | fd6064b571 | Merge pull request #787 from duc-phamh/improve_message_trimming (Improve message trimming) | 2023-11-11 09:39:43 -08:00
ishaan-jaff | 9d3a28e391 | (test) async_fn with stream | 2023-11-10 17:47:15 -08:00
ishaan-jaff | c2c186eb28 | (test) add gpt-4 vision | 2023-11-10 17:34:40 -08:00
ishaan-jaff | 78e1ed9575 | (fix) proxy raise exception when config passed in | 2023-11-10 16:28:34 -08:00
ishaan-jaff | 29eac53d76 | (fix) proxy print exception when reading config | 2023-11-10 16:22:20 -08:00
ishaan-jaff | 9b78bbc6ea | (fix) ssl changes | 2023-11-10 15:57:59 -08:00
ishaan-jaff | f9d4505ea0 | (fix) ssl for acompletion with openai | 2023-11-10 15:55:04 -08:00
Krrish Dholakia | 41c94d50e2 | fix(text_completion.py): fix routing logic | 2023-11-10 15:46:37 -08:00
Krrish Dholakia | 697497cdfa | test(test_streaming.py): set cache to none | 2023-11-10 15:24:17 -08:00
ishaan-jaff | 2c32f4a588 | (test) fix caching | 2023-11-10 15:23:56 -08:00
Krrish Dholakia | 91dc9a34e5 | test(test_langfuse.py): add retries to langfuse test | 2023-11-10 15:09:21 -08:00
Krrish Dholakia | 40edb546dc | test(test_streaming.py): set cache to none | 2023-11-10 15:04:01 -08:00
Krrish Dholakia | abee5a0e05 | fix(utils.py): caching for embedding | 2023-11-10 14:33:17 -08:00
ishaan-jaff | af98f74c82 | (feat) replicate add exception mapping for streaming + better logging when polling | 2023-11-10 12:46:33 -08:00
ishaan-jaff | 1c1a260065 | (test) replicate with streaming, num retries | 2023-11-10 12:46:33 -08:00
ishaan-jaff | 49c8c8a74f | (fix) streaming replicate | 2023-11-10 12:46:33 -08:00
ishaan-jaff | 4c3765119b | (fix) replicate print verbose | 2023-11-10 12:46:33 -08:00