Author | Commit | Message | Date
-------|--------|---------|-----
ishaan-jaff | fe82e172b9 | (fix) clean up proxy cli | 2023-11-08 13:48:02 -08:00
ishaan-jaff | 547f41071e | (fix) proxy cli --test | 2023-11-08 12:00:13 -08:00
ishaan-jaff | afea84e0c0 | (test) run a test again | 2023-11-08 10:34:57 -08:00
ishaan-jaff | aba39670a3 | (test) run ci/cd again | 2023-11-08 10:21:19 -08:00
ishaan-jaff | e0116d2991 | (fix) proxy server remove bloat | 2023-11-06 15:55:18 -08:00
Krrish Dholakia | 1b87e5a337 | fix(proxy_server.py): fixing import issues | 2023-11-05 21:14:59 -08:00
Krrish Dholakia | 21ae940992 | bump: version 0.13.1 → 0.13.2.dev1 | 2023-11-05 21:12:13 -08:00
Krrish Dholakia | 3a4370ae20 | bump: version 0.13.1.dev2 → 0.13.1.dev3 | 2023-11-04 22:31:53 -07:00
Krrish Dholakia | cef67a9beb | bump: version 0.13.1.dev1 → 0.13.1.dev2 | 2023-11-04 22:31:53 -07:00
Krrish Dholakia | 3b46030eca | fix(proxy_cli.py): uvicorn issue | 2023-11-04 22:31:53 -07:00
Krrish Dholakia | 5b3978eff4 | fix(main.py): fixing print_verbose | 2023-11-04 14:41:34 -07:00
Krrish Dholakia | 6b3671b593 | fix(proxy_server.py): accept config.yaml | 2023-11-03 12:50:52 -07:00
ishaan-jaff | 4d82c81531 | (fix) proxy cli tests | 2023-11-02 21:14:08 -07:00
ishaan-jaff | 19737f95c5 | (feat) proxy add testing for openai.Completion.create | 2023-11-01 18:25:13 -07:00
ishaan-jaff | c038731c48 | (add request_timeout) as param to proxy_server | 2023-10-20 11:55:42 -07:00
coconut49 | 52fdfe5819 | Improve code formatting and allow configurable litellm config path via environment variable. | 2023-10-20 12:19:26 +08:00
Krrish Dholakia | a7c3fc2fd9 | fix(utils.py): mapping azure api version missing exception | 2023-10-17 17:12:51 -07:00
Krrish Dholakia | 22937b3b16 | test(test_proxy.py): adding testing for proxy server | 2023-10-17 16:29:11 -07:00
Krrish Dholakia | 2f57dc8906 | refactor(proxy_cli.py): code cleanup | 2023-10-17 13:29:47 -07:00
coconut49 | 07f06c6479 | Refactor Dockerfile and proxy_cli.py to use new secrets file location | 2023-10-18 01:17:03 +08:00
Krrish Dholakia | 7358d2e4ea | bump: version 0.8.4 → 0.8.5 | 2023-10-14 16:43:06 -07:00
ishaan-jaff | a86a140556 | (feat) add swagger docs to cli config | 2023-10-14 12:32:33 -07:00
ishaan-jaff | b6a015404e | (feat) proxy_cli cleanup | 2023-10-14 12:19:56 -07:00
Krrish Dholakia | d6f2d9b9bb | docs(custom_callback.md): add details on what kwargs are passed to custom callbacks | 2023-10-14 11:29:26 -07:00
Krrish Dholakia | 3210ebfc7a | fix(proxy_cli.py): fix adding keys flow - let user use --add_key to add new keys | 2023-10-13 22:24:58 -07:00
Krrish Dholakia | d9ddd7b338 | fix(proxy_server): cleaning up print statements | 2023-10-13 20:16:31 -07:00
Krrish Dholakia | 2c0280cff3 | fix(proxy_cli): add logs and config | 2023-10-13 15:14:21 -07:00
Krrish Dholakia | 4f172101df | docs(proxy): added readme | 2023-10-12 21:09:40 -07:00
Krrish Dholakia | b28c055896 | feat(proxy_server): adds create-proxy feature | 2023-10-12 18:27:07 -07:00
Krrish Dholakia | 098a86f678 | fix(proxy_cli): prints the location of the config file | 2023-10-11 21:19:44 -07:00
ishaan-jaff | 7c81e7449f | (feat) proxy_server use fallback port if 8000 occupied | 2023-10-11 15:35:14 -07:00
Krrish Dholakia | d0ec844629 | fix(proxy_server): import errors | 2023-10-11 11:05:31 -07:00
ishaan-jaff | ad8ffa9014 | (fix) remove print | 2023-10-11 10:59:43 -07:00
Krrish Dholakia | ca7e2f6a05 | docs(proxy_server.md): add docker image details to docs | 2023-10-11 08:28:08 -07:00
Krrish Dholakia | d280a8c434 | fix(proxy_cli-and-utils.py): fixing how config file is read + infering llm_provider for known openai endpoints | 2023-10-10 20:53:02 -07:00
Krrish Dholakia | 1b88868f2c | fix(env-template): fixing togetherai api key naming in env template | 2023-10-10 18:43:42 -07:00
Krrish Dholakia | 1df3f349fe | style(proxy_cli): additional tracing | 2023-10-10 18:17:57 -07:00
Krrish Dholakia | 661ea2359b | refactor(proxy_cli): adding additional tracing | 2023-10-10 18:12:31 -07:00
Krrish Dholakia | a6f35c8d7a | bump: version 0.7.0 → 0.7.1.dev1 | 2023-10-10 18:07:57 -07:00
Krrish Dholakia | b2b724a35c | style(proxy_cli.py): adding feedback box | 2023-10-10 13:49:54 -07:00
ishaan-jaff | ba754a07a3 | (feat) add --cost as a flag to the proxy server cli | 2023-10-09 15:05:17 -07:00
Krrish Dholakia | a9f7a80e3d | feat(proxy_cli.py): add max budget to proxy | 2023-10-09 14:11:30 -07:00
Krrish Dholakia | 4059f408d0 | fix(proxy_cli): accept drop params and add_function_to_prompt | 2023-10-09 13:10:07 -07:00
Krrish Dholakia | 3d809707c0 | fix(proxy_cli.py): add drop params and add function to prompt in cli (complete issue) https://github.com/BerriAI/litellm/issues/557 | 2023-10-09 11:33:45 -07:00
Sir-Photch | 708d61b207 | make --test respect host and port | 2023-10-08 20:42:49 +02:00
Christoph | 64c9795871 | Add host cli parameter | 2023-10-08 17:59:47 +00:00
ishaan-jaff | e987d31028 | (feat+fix) proxy_cli max_tokens int, --test stream | 2023-10-07 16:38:40 -07:00
Krrish Dholakia | f0c9c24925 | fix(proxy_cli.py): check if model passed in | 2023-10-07 07:52:02 -07:00
Krrish Dholakia | 52b0bcb5ec | feat(proxy_cli.py): when user calls ollama model, run ollama serve | 2023-10-06 16:46:52 -07:00
Krrish Dholakia | 7e34736a38 | fix(add-custom-success-callback-for-streaming): add custom success callback for streaming | 2023-10-06 15:02:02 -07:00