| Author | Commit | Message | Date |
|---|---|---|---|
| ishaan-jaff | 9747cc5aad | (feat) --health for checking config models | 2023-11-27 12:13:21 -08:00 |
| Krrish Dholakia | 61fc76a8c4 | fix(router.py): fix caching for tracking cooldowns + usage | 2023-11-23 11:13:32 -08:00 |
| Krrish Dholakia | 826f56a6a0 | docs(routing.md): update routing docs | 2023-11-21 19:32:50 -08:00 |
| Krrish Dholakia | 9d97082eed | docs(routing.md): add queueing to docs | 2023-11-21 18:01:02 -08:00 |
| ishaan-jaff | 2a35ff88a7 | (fix) proxy server LiteLLM warning | 2023-11-21 08:50:31 -08:00 |
| Krrish Dholakia | 1976d0f7d6 | fix(routing.py): update token usage on streaming | 2023-11-20 14:19:25 -08:00 |
| Krrish Dholakia | 1738341dcb | fix(main.py): misrouting ollama models to nlp cloud | 2023-11-14 18:55:08 -08:00 |
| ishaan-jaff | e125414611 | (fix) proxy cli compatible with openai v1.0.0 | 2023-11-13 10:58:20 -08:00 |
| ishaan-jaff | 18b694f01a | (fix) proxy cli use openai v1.0.0 | 2023-11-13 10:08:48 -08:00 |
| ishaan-jaff | cf0ab7155e | (fix) proxy + docs: use openai.chat.completions.create instead of openai.ChatCompletions | 2023-11-13 08:24:26 -08:00 |
| ishaan-jaff | 78e1ed9575 | (fix) proxy raise exception when config passed in | 2023-11-10 16:28:34 -08:00 |
| ishaan-jaff | 333268c8b7 | (fix) proxy cli default local debug to False | 2023-11-09 11:30:11 -08:00 |
| ishaan-jaff | 24c0a65347 | (fix) proxy server clean print statements | 2023-11-09 11:18:56 -08:00 |
| ishaan-jaff | 03940eab8a | (fix) prxy server remove create_proxy | 2023-11-09 11:12:20 -08:00 |
| ishaan-jaff | 285c678786 | (fix) proxy remove --create_proxy | 2023-11-09 11:10:08 -08:00 |
| ishaan-jaff | fe82e172b9 | (fix) clean up proxy cli | 2023-11-08 13:48:02 -08:00 |
| ishaan-jaff | 547f41071e | (fix) proxy cli --test | 2023-11-08 12:00:13 -08:00 |
| ishaan-jaff | afea84e0c0 | (test) run a test again | 2023-11-08 10:34:57 -08:00 |
| ishaan-jaff | aba39670a3 | (test) run ci/cd again | 2023-11-08 10:21:19 -08:00 |
| ishaan-jaff | e0116d2991 | (fix) proxy server remove bloat | 2023-11-06 15:55:18 -08:00 |
| Krrish Dholakia | 1b87e5a337 | fix(proxy_server.py): fixing import issues | 2023-11-05 21:14:59 -08:00 |
| Krrish Dholakia | 21ae940992 | bump: version 0.13.1 → 0.13.2.dev1 | 2023-11-05 21:12:13 -08:00 |
| Krrish Dholakia | 3a4370ae20 | bump: version 0.13.1.dev2 → 0.13.1.dev3 | 2023-11-04 22:31:53 -07:00 |
| Krrish Dholakia | cef67a9beb | bump: version 0.13.1.dev1 → 0.13.1.dev2 | 2023-11-04 22:31:53 -07:00 |
| Krrish Dholakia | 3b46030eca | fix(proxy_cli.py): uvicorn issue | 2023-11-04 22:31:53 -07:00 |
| Krrish Dholakia | 5b3978eff4 | fix(main.py): fixing print_verbose | 2023-11-04 14:41:34 -07:00 |
| Krrish Dholakia | 6b3671b593 | fix(proxy_server.py): accept config.yaml | 2023-11-03 12:50:52 -07:00 |
| ishaan-jaff | 4d82c81531 | (fix) proxy cli tests | 2023-11-02 21:14:08 -07:00 |
| ishaan-jaff | 19737f95c5 | (feat) proxy add testing for openai.Completion.create | 2023-11-01 18:25:13 -07:00 |
| ishaan-jaff | c038731c48 | (add request_timeout) as param to proxy_server | 2023-10-20 11:55:42 -07:00 |
| coconut49 | 52fdfe5819 | Improve code formatting and allow configurable litellm config path via environment variable. | 2023-10-20 12:19:26 +08:00 |
| Krrish Dholakia | a7c3fc2fd9 | fix(utils.py): mapping azure api version missing exception | 2023-10-17 17:12:51 -07:00 |
| Krrish Dholakia | 22937b3b16 | test(test_proxy.py): adding testing for proxy server | 2023-10-17 16:29:11 -07:00 |
| Krrish Dholakia | 2f57dc8906 | refactor(proxy_cli.py): code cleanup | 2023-10-17 13:29:47 -07:00 |
| coconut49 | 07f06c6479 | Refactor Dockerfile and proxy_cli.py to use new secrets file location | 2023-10-18 01:17:03 +08:00 |
| Krrish Dholakia | 7358d2e4ea | bump: version 0.8.4 → 0.8.5 | 2023-10-14 16:43:06 -07:00 |
| ishaan-jaff | a86a140556 | (feat) add swagger docs to cli config | 2023-10-14 12:32:33 -07:00 |
| ishaan-jaff | b6a015404e | (feat) proxy_cli cleanup | 2023-10-14 12:19:56 -07:00 |
| Krrish Dholakia | d6f2d9b9bb | docs(custom_callback.md): add details on what kwargs are passed to custom callbacks | 2023-10-14 11:29:26 -07:00 |
| Krrish Dholakia | 3210ebfc7a | fix(proxy_cli.py): fix adding keys flow - let user use --add_key to add new keys | 2023-10-13 22:24:58 -07:00 |
| Krrish Dholakia | d9ddd7b338 | fix(proxy_server): cleaning up print statements | 2023-10-13 20:16:31 -07:00 |
| Krrish Dholakia | 2c0280cff3 | fix(proxy_cli): add logs and config | 2023-10-13 15:14:21 -07:00 |
| Krrish Dholakia | 4f172101df | docs(proxy): added readme | 2023-10-12 21:09:40 -07:00 |
| Krrish Dholakia | b28c055896 | feat(proxy_server): adds create-proxy feature | 2023-10-12 18:27:07 -07:00 |
| Krrish Dholakia | 098a86f678 | fix(proxy_cli): prints the location of the config file | 2023-10-11 21:19:44 -07:00 |
| ishaan-jaff | 7c81e7449f | (feat) proxy_server use fallback port if 8000 occupied | 2023-10-11 15:35:14 -07:00 |
| Krrish Dholakia | d0ec844629 | fix(proxy_server): import errors | 2023-10-11 11:05:31 -07:00 |
| ishaan-jaff | ad8ffa9014 | (fix) remove print | 2023-10-11 10:59:43 -07:00 |
| Krrish Dholakia | ca7e2f6a05 | docs(proxy_server.md): add docker image details to docs | 2023-10-11 08:28:08 -07:00 |
| Krrish Dholakia | d280a8c434 | fix(proxy_cli-and-utils.py): fixing how config file is read + infering llm_provider for known openai endpoints | 2023-10-10 20:53:02 -07:00 |