| Author | Commit | Message | Date |
|---|---|---|---|
| coconut49 | cfeaa79bea | merge | 2023-10-18 01:47:56 +08:00 |
| coconut49 | 2b02400eb8 | Refactor proxy_server.py to simplify v1 endpoints and improve logging | 2023-10-18 00:35:51 +08:00 |
| Krrish Dholakia | 44cafb5bac | docs(proxy_server.md): update proxy server docs to include multi-agent autogen tutorial | 2023-10-17 09:22:34 -07:00 |
| coconut49 | 8c0a3473a1 | Refactor proxy_server.py for readability and code consistency | 2023-10-17 23:48:55 +08:00 |
| Krrish Dholakia | f221eac41a | fix(proxy_server): improve error handling | 2023-10-16 19:42:53 -07:00 |
| Krrish Dholakia | 5894614e89 | refactor(proxy_server.py): code cleanup | 2023-10-16 16:47:33 -07:00 |
| canada4663 | 8a68957f6e | updated proxy_server.py /models endpoint with the results of get_valid_models() | 2023-10-14 20:54:47 -07:00 |
| Krrish Dholakia | 5f9dd0b21f | bump: version 0.8.4 → 0.8.5 | 2023-10-14 16:43:06 -07:00 |
| Krrish Dholakia | bfac8b5a76 | docs(proxy_server.md): add logs, save keys, model fallbacks, config file template to proxy server docs | 2023-10-14 10:52:01 -07:00 |
| Krrish Dholakia | 29877b35b0 | fix(proxy_cli.py): fix adding keys flow - let user use --add_key to add new keys | 2023-10-13 22:24:58 -07:00 |
| Krrish Dholakia | ec5e7aa4a9 | fix(openai.p): adding support for exception mapping for openai-compatible apis via http calls | 2023-10-13 21:56:51 -07:00 |
| ishaan-jaff | 99b51829b4 | (feat) add swagger.json for litellm proxy | 2023-10-13 20:41:04 -07:00 |
| Krrish Dholakia | a27e9aaf8d | fix(proxy_server): cleaning up print statements | 2023-10-13 20:16:31 -07:00 |
| Krrish Dholakia | 1afd53f452 | fix(proxy_server): fix cors issue | 2023-10-13 15:47:28 -07:00 |
| Krrish Dholakia | 74c0d5b7a0 | feat(proxy_server): adding model fallbacks and default model to toml | 2023-10-13 15:31:17 -07:00 |
| Krrish Dholakia | 90c8b3a193 | fix(proxy_cli): add logs and config | 2023-10-13 15:14:21 -07:00 |
| Krrish Dholakia | 46f3b9a68b | refactor(proxy_server): clean up print statements | 2023-10-12 21:39:16 -07:00 |
| Krrish Dholakia | 04276e5c86 | fix(gitmodules): remapping to new proxy | 2023-10-12 21:23:53 -07:00 |
| Krrish Dholakia | 7ec5351305 | docs(proxy): added readme | 2023-10-12 21:09:40 -07:00 |
| Krrish Dholakia | d0b4dfd26c | feat(proxy_server): adds create-proxy feature | 2023-10-12 18:27:07 -07:00 |
| ishaan-jaff | 873b15e3cd | (feat) show costs.json in proxy_server.py | 2023-10-12 15:07:37 -07:00 |
| ishaan-jaff | ab16d00061 | (feat) proxy_server new cost.json on litellm. Track daily cost & num_requests | 2023-10-12 11:37:35 -07:00 |
| Krrish Dholakia | e150dc2e8e | fix(proxy_cli): prints the location of the config file | 2023-10-11 21:19:44 -07:00 |
| Krrish Dholakia | 778648e84f | fix(proxy_server): fix prompt template for proxy server | 2023-10-11 21:08:42 -07:00 |
| ishaan-jaff | c60f9db97a | (fix) proxy_server linting errors | 2023-10-11 20:52:03 -07:00 |
| ishaan-jaff | c4f816da36 | (feat) proxy_server add /v1 | 2023-10-11 14:35:05 -07:00 |
| Krrish Dholakia | a7e4f743cd | fix(proxy_server): import errors | 2023-10-11 11:05:31 -07:00 |
| Krrish Dholakia | 8711cd91c5 | fix(proxy_server.py): use tomli instead of tomllib | 2023-10-11 10:43:53 -07:00 |
| ishaan-jaff | 2a3352faf4 | (fix) proxy_server add LiteLLM: Running | 2023-10-11 08:42:03 -07:00 |
| Krrish Dholakia | 87e5f79924 | fix(proxy_cli-and-utils.py): fixing how config file is read + infering llm_provider for known openai endpoints | 2023-10-10 20:53:02 -07:00 |
| Krrish Dholakia | d43160ab8a | style(proxy_cli.py): adding feedback box | 2023-10-10 13:49:54 -07:00 |
| ishaan-jaff | 1bb8faa607 | (feat) proxy_server add a track_cost_callback for streaming | 2023-10-10 11:33:08 -07:00 |
| ishaan-jaff | efcecb21c3 | (feat) proxy_server: begin using callback for tracking costs | 2023-10-10 08:35:31 -07:00 |
| ishaan-jaff | c5ea2b4ebe | (feat) add --cost as a flag to the proxy server cli | 2023-10-09 15:05:17 -07:00 |
| Krrish Dholakia | 5ff4dbaa5c | fix(proxy_server): returns better error messages for invalid api errors | 2023-10-09 15:03:44 -07:00 |
| ishaan-jaff | e559e79fd6 | (feat) add cost tracking to proxy server | 2023-10-09 14:51:37 -07:00 |
| Krrish Dholakia | 4d34bfe68a | feat(proxy_cli.py): add max budget to proxy | 2023-10-09 14:11:30 -07:00 |
| Krrish Dholakia | 51b0a530ff | fix(proxy_server.py): add link to docs | 2023-10-09 11:35:42 -07:00 |
| Krrish Dholakia | ce1b7e244e | fix(proxy_cli.py): add drop params and add function to prompt in cli (complete issue) https://github.com/BerriAI/litellm/issues/557 | 2023-10-09 11:33:45 -07:00 |
| Krrish Dholakia | 3f7f7fd886 | docs(proxy_server.md): adding /ollama_logs endpoint to docs | 2023-10-07 20:38:19 -07:00 |
| Krrish Dholakia | 7339461971 | docs(proxy_server): doc cleanup | 2023-10-07 17:29:04 -07:00 |
| ishaan-jaff | 80d3016117 | (feat) proxy_server display model list when user does not specify model | 2023-10-07 17:19:02 -07:00 |
| Krrish Dholakia | 5ab3a4b8d7 | fix(add-custom-success-callback-for-streaming): add custom success callback for streaming | 2023-10-06 15:02:02 -07:00 |
| Krrish Dholakia | 666623fcf3 | fix(proxy_server.py): make completion call handle "v1" in endpoint url | 2023-10-06 09:17:02 -07:00 |
| Krrish Dholakia | eec2c848f1 | improvements to proxy cli and finish reason mapping for anthropic | 2023-09-30 18:09:16 -07:00 |
| ishaan-jaff | 148ea7edd0 | use render endpoint for proxy | 2023-09-30 17:27:03 -07:00 |
| ishaan-jaff | 25586faaf0 | cli deploy | 2023-09-30 15:45:00 -07:00 |
| ishaan-jaff | dbe784995f | use deploy flag | 2023-09-29 22:03:38 -07:00 |
| Krrish Dholakia | a86e6e1a8c | fix linting test | 2023-09-29 21:44:46 -07:00 |
| Krrish Dholakia | e258aa6f37 | update values | 2023-09-29 20:53:55 -07:00 |
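A listing in the author/hash/subject/date shape above can be produced with `git log --pretty`. The sketch below demonstrates the format string against a throwaway repository rather than the litellm checkout, so it is self-contained; the author name, email, and commit message are illustrative placeholders, not from the history above.

```shell
# Minimal demo: create a throwaway repo with one commit, then print its
# log as "author | abbreviated hash | subject | ISO date".
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.name="demo" -c user.email="demo@example.com" \
    commit -q --allow-empty -m "fix(proxy_server): demo commit"
git log --date=iso --pretty=format:'%an | %h | %s | %ad'
```

Against a real checkout, appending a pathspec such as `-- <path/to/proxy_server.py>` would restrict the log to commits touching that file.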