Ishaan Jaff | 58d0366447 | Merge pull request #1399 from BerriAI/litellm_default_use_gunicorn: LiteLLM Proxy - Use Gunicorn with Uvicorn workers | 2024-01-10 21:46:04 +05:30
Krrish Dholakia | b06d7f0cb6 | build(config.yml): reintroduce mounting config.yaml | 2024-01-10 18:03:57 +05:30
ishaan-jaff | 2b9174c3d7 | (feat) add comments on starting with gunicorn | 2024-01-10 17:50:51 +05:30
ishaan-jaff | 67dc9adc71 | (fix) import gunicorn | 2024-01-10 17:47:34 +05:30
ishaan-jaff | 873965df22 | (chore) remove old uvicorn logic | 2024-01-10 17:39:05 +05:30
ishaan-jaff | 5136d5980f | (fix) use gunicorn to start proxy | 2024-01-10 17:09:03 +05:30
ishaan-jaff | c7fe33202d | v0 | 2024-01-10 16:29:38 +05:30
ishaan-jaff | 6786e4f343 | (feat) allow users to opt into detailed debug on proxy | 2024-01-08 12:53:41 +05:30
ishaan-jaff | ab90b547d8 | (fix) proxy - raise error when user missing litellm[proxy] | 2023-12-28 13:07:44 +05:30
ishaan-jaff | c1a8e30b01 | (feat) proxy - remove appdirs | 2023-12-27 17:40:05 +05:30
ishaan-jaff | d273d19bd9 | (feat) proxy, use --model with --test | 2023-12-26 09:40:58 +05:30
Krrish Dholakia | 4905929de3 | refactor: add black formatting | 2023-12-25 14:11:20 +05:30
ishaan-jaff | 77bcaaae9e | (fix) proxy cli --version | 2023-12-14 13:22:39 +05:30
ishaan-jaff | 241add8b33 | (feat) proxy add --version | 2023-12-14 12:28:42 +05:30
Krrish Dholakia | f10bb708c0 | fix: fix run_ollama_serve to only run if api base is none | 2023-12-09 21:31:46 -08:00
Krrish Dholakia | ed50522863 | fix(proxy_server.py): fix pydantic version errors | 2023-12-09 12:09:49 -08:00
ishaan-jaff | 27d7d7ba9c | (feat) proxy cli, better description of config yaml param | 2023-12-05 18:11:29 -08:00
ishaan-jaff | 155e99b9a3 | (fix) proxy cli: remove deprecated param | 2023-12-05 18:04:08 -08:00
ishaan-jaff | 9747cc5aad | (feat) --health for checking config models | 2023-11-27 12:13:21 -08:00
Krrish Dholakia | 61fc76a8c4 | fix(router.py): fix caching for tracking cooldowns + usage | 2023-11-23 11:13:32 -08:00
Krrish Dholakia | 826f56a6a0 | docs(routing.md): update routing docs | 2023-11-21 19:32:50 -08:00
Krrish Dholakia | 9d97082eed | docs(routing.md): add queueing to docs | 2023-11-21 18:01:02 -08:00
ishaan-jaff | 2a35ff88a7 | (fix) proxy server LiteLLM warning | 2023-11-21 08:50:31 -08:00
Krrish Dholakia | 1976d0f7d6 | fix(routing.py): update token usage on streaming | 2023-11-20 14:19:25 -08:00
Krrish Dholakia | 1738341dcb | fix(main.py): misrouting ollama models to nlp cloud | 2023-11-14 18:55:08 -08:00
ishaan-jaff | e125414611 | (fix) proxy cli compatible with openai v1.0.0 | 2023-11-13 10:58:20 -08:00
ishaan-jaff | 18b694f01a | (fix) proxy cli use openai v1.0.0 | 2023-11-13 10:08:48 -08:00
ishaan-jaff | cf0ab7155e | (fix) proxy + docs: use openai.chat.completions.create instead of openai.ChatCompletions | 2023-11-13 08:24:26 -08:00
ishaan-jaff | 78e1ed9575 | (fix) proxy raise exception when config passed in | 2023-11-10 16:28:34 -08:00
ishaan-jaff | 333268c8b7 | (fix) proxy cli default local debug to False | 2023-11-09 11:30:11 -08:00
ishaan-jaff | 24c0a65347 | (fix) proxy server clean print statements | 2023-11-09 11:18:56 -08:00
ishaan-jaff | 03940eab8a | (fix) proxy server remove create_proxy | 2023-11-09 11:12:20 -08:00
ishaan-jaff | 285c678786 | (fix) proxy remove --create_proxy | 2023-11-09 11:10:08 -08:00
ishaan-jaff | fe82e172b9 | (fix) clean up proxy cli | 2023-11-08 13:48:02 -08:00
ishaan-jaff | 547f41071e | (fix) proxy cli --test | 2023-11-08 12:00:13 -08:00
ishaan-jaff | afea84e0c0 | (test) run a test again | 2023-11-08 10:34:57 -08:00
ishaan-jaff | aba39670a3 | (test) run ci/cd again | 2023-11-08 10:21:19 -08:00
ishaan-jaff | e0116d2991 | (fix) proxy server remove bloat | 2023-11-06 15:55:18 -08:00
Krrish Dholakia | 1b87e5a337 | fix(proxy_server.py): fixing import issues | 2023-11-05 21:14:59 -08:00
Krrish Dholakia | 21ae940992 | bump: version 0.13.1 → 0.13.2.dev1 | 2023-11-05 21:12:13 -08:00
Krrish Dholakia | 3a4370ae20 | bump: version 0.13.1.dev2 → 0.13.1.dev3 | 2023-11-04 22:31:53 -07:00
Krrish Dholakia | cef67a9beb | bump: version 0.13.1.dev1 → 0.13.1.dev2 | 2023-11-04 22:31:53 -07:00
Krrish Dholakia | 3b46030eca | fix(proxy_cli.py): uvicorn issue | 2023-11-04 22:31:53 -07:00
Krrish Dholakia | 5b3978eff4 | fix(main.py): fixing print_verbose | 2023-11-04 14:41:34 -07:00
Krrish Dholakia | 6b3671b593 | fix(proxy_server.py): accept config.yaml | 2023-11-03 12:50:52 -07:00
ishaan-jaff | 4d82c81531 | (fix) proxy cli tests | 2023-11-02 21:14:08 -07:00
ishaan-jaff | 19737f95c5 | (feat) proxy add testing for openai.Completion.create | 2023-11-01 18:25:13 -07:00
ishaan-jaff | c038731c48 | (add request_timeout) as param to proxy_server | 2023-10-20 11:55:42 -07:00
coconut49 | 52fdfe5819 | Improve code formatting and allow configurable litellm config path via environment variable. | 2023-10-20 12:19:26 +08:00
Krrish Dholakia | a7c3fc2fd9 | fix(utils.py): mapping azure api version missing exception | 2023-10-17 17:12:51 -07:00
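The top entry in this history switches the LiteLLM proxy from plain Uvicorn to Gunicorn with Uvicorn workers. A minimal sketch of that launch pattern is below; the app module path, worker count, and bind address are illustrative assumptions, not values taken from these commits:

```shell
# Run an ASGI app under Gunicorn, with Uvicorn supplying the worker class.
# Gunicorn handles process management; each worker serves the app via Uvicorn.
# "litellm.proxy.proxy_server:app", the worker count, and the port are assumptions.
gunicorn litellm.proxy.proxy_server:app \
  --workers 4 \
  --worker-class uvicorn.workers.UvicornWorker \
  --bind 0.0.0.0:4000
```

This pattern is the standard way to get multi-process serving for an async (FastAPI/Starlette) app: plain `uvicorn` runs a single event loop per process, while Gunicorn adds pre-fork worker supervision and restarts on top.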