| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| ishaan-jaff | f4a7760ea1 | (feat+test) use passed OpenAI client | 2023-11-28 16:09:10 -08:00 |
| ishaan-jaff | 8ac7801283 | (feat) completion:openai pass OpenAI client | 2023-11-28 16:05:01 -08:00 |
| ishaan-jaff | 01c38d37fa | (test) pass client to Azure completion | 2023-11-28 15:57:11 -08:00 |
| ishaan-jaff | 400a268934 | (feat) completion: Azure allow users to pass client to router | 2023-11-28 15:56:52 -08:00 |
| ishaan-jaff | 1a0b683a8e | (test) using client: compleition | 2023-11-28 15:44:56 -08:00 |
| ishaan-jaff | ee6f5a84db | (test) load test completion | 2023-11-28 15:44:56 -08:00 |
| ishaan-jaff | ae7f0ae0b6 | (feat) proxy: add logs on router performance | 2023-11-28 15:44:56 -08:00 |
| ishaan-jaff | 94d35f1ec5 | (feat) router: re-use the same client for high trafic | 2023-11-28 15:44:56 -08:00 |
| ishaan-jaff | 7914623fbc | (feat) allow users to pass azure client for acmompletion | 2023-11-28 15:44:56 -08:00 |
| ishaan-jaff | 2a69cab550 | (feat) router track total, success, failed calls per model | 2023-11-28 15:44:56 -08:00 |
| Krrish Dholakia | 20380804b3 | bump: version 1.7.8 → 1.7.9 | 2023-11-28 14:05:23 -08:00 |
| Krrish Dholakia | 4ea52dd571 | fix(proxy_server.py): support reading master key from os environment | 2023-11-28 14:05:17 -08:00 |
| Krrish Dholakia | 094144de58 | fix(router.py): removing model id before making call | 2023-11-28 10:09:45 -08:00 |
| Krrish Dholakia | 5ed957ebbe | fix(utils.py): bug fix return only non-null responses | 2023-11-28 09:43:42 -08:00 |
| Krrish Dholakia | 150b91d476 | fix(utils.py): fix streaming on-success logging | 2023-11-28 09:11:47 -08:00 |
| Krrish Dholakia | cd2883065a | docs(stream.md): add stream chunk builder helper function to docs | 2023-11-28 08:58:35 -08:00 |
| ishaan-jaff | 30b80afe31 | (docs) fix local debugging | 2023-11-28 08:49:14 -08:00 |
| ishaan-jaff | 224a028ab6 | (fix) completion: AZURE_OPENAI_API_KEY | 2023-11-28 08:06:06 -08:00 |
| Krrish Dholakia | 6149642295 | refactor(router.py): fix linting errors | 2023-11-27 22:11:53 -08:00 |
| Krrish Dholakia | 82d79638d4 | refactor(router.py): fix linting errors | 2023-11-27 22:08:48 -08:00 |
| Krrish Dholakia | c4aea7432f | build: adding debug logs to gitignore | 2023-11-27 22:05:07 -08:00 |
| ishaan-jaff | c52861906b | (test) router cooldowns | 2023-11-27 22:03:02 -08:00 |
| ishaan-jaff | 3ca4487e77 | (feat) proxy set num_retries=3 | 2023-11-27 19:33:59 -08:00 |
| Krrish Dholakia | 60a4906505 | bump: version 1.7.7 → 1.7.8 | 2023-11-27 19:11:48 -08:00 |
| Krrish Dholakia | be9fa06da6 | fix(main.py): fix linting errors | 2023-11-27 19:11:38 -08:00 |
| Krrish Dholakia | e8331a4647 | fix(utils.py): azure tool calling streaming | 2023-11-27 19:07:38 -08:00 |
| Krrish Dholakia | 4cdd930fa2 | fix(stream_chunk_builder): adding support for tool calling in completion counting | 2023-11-27 18:39:47 -08:00 |
| ishaan-jaff | 40d9e8ab23 | (test) load test | 2023-11-27 18:08:47 -08:00 |
| ishaan-jaff | 50733363ee | (feat) use api_base, api_key as model | 2023-11-27 18:08:47 -08:00 |
| ishaan-jaff | 9cef551623 | (feat) raise APIConnectionError error for Azure +OpenAI | 2023-11-27 18:08:47 -08:00 |
| Krrish Dholakia | 04f745e314 | fix(router.py): speed improvements to the router | 2023-11-27 17:35:26 -08:00 |
| ishaan-jaff | 8560794963 | (test) load test router | 2023-11-27 16:37:57 -08:00 |
| ishaan-jaff | 18d9222945 | (test) litellm using uuid for model name | 2023-11-27 16:37:39 -08:00 |
| ishaan-jaff | 4265f9b2ef | (fix) router: allow same model/name | 2023-11-27 16:26:09 -08:00 |
| ishaan-jaff | ba228a9e0a | (fix) proxy set litellm attributes | 2023-11-27 13:39:18 -08:00 |
| ishaan-jaff | d7dd9f0307 | (docs) health check proxy llms | 2023-11-27 12:20:15 -08:00 |
| ishaan-jaff | 5e2c13fb11 | (test) load test proxy completion | 2023-11-27 12:13:21 -08:00 |
| ishaan-jaff | 9747cc5aad | (feat) --health for checking config models | 2023-11-27 12:13:21 -08:00 |
| Krrish Dholakia | 56bb39e52c | fix(acompletion): fix acompletion raise exception issue when custom llm provider is none | 2023-11-27 11:34:48 -08:00 |
| ishaan-jaff | 37f3b1edd1 | (ci/cd) run again | 2023-11-27 11:12:11 -08:00 |
| ishaan-jaff | 9d259d08e7 | (linting) fix | 2023-11-27 10:27:51 -08:00 |
| ishaan-jaff | f54c4ee4b8 | (docs) sagemaker add chat llms | 2023-11-27 10:24:32 -08:00 |
| ishaan-jaff | 18ca445bc2 | bump: version 1.7.6 → 1.7.7 | 2023-11-27 10:14:03 -08:00 |
| ishaan-jaff | a4754f9098 | (test) competion | 2023-11-27 10:13:46 -08:00 |
| ishaan-jaff | 26938f697e | (feat) completion:debugging - show raw POST request | 2023-11-27 10:13:37 -08:00 |
| ishaan-jaff | 90687d51f1 | (test) sagemaker add chat models | 2023-11-27 10:11:56 -08:00 |
| ishaan-jaff | f7ae01da8a | (feat) completion:sagemaker - support chat models | 2023-11-27 10:11:10 -08:00 |
| ishaan-jaff | b732d4c394 | (feat) model context json: add sagemaker | 2023-11-27 09:55:53 -08:00 |
| ishaan-jaff | e407b185ee | (feat) completion:sagemaker - better debugging | 2023-11-27 09:08:20 -08:00 |
| ishaan-jaff | d0538e32c9 | (feat) completion: sagemaker debugging - show boto3 request sent | 2023-11-27 09:04:50 -08:00 |