Commit graph

1285 commits

Author  SHA1  Message  Date
Krrish Dholakia  b28c055896  feat(proxy_server): adds create-proxy feature  2023-10-12 18:27:07 -07:00
ishaan-jaff  4b3e4c97b8  (feat) show costs.json in proxy_server.py  2023-10-12 15:07:37 -07:00
ishaan-jaff  e5ae870dd5  (feat) proxy_server new cost.json on litellm. Track daily cost & num_requests  2023-10-12 11:37:35 -07:00
Krrish Dholakia  098a86f678  fix(proxy_cli): prints the location of the config file  2023-10-11 21:19:44 -07:00
Krrish Dholakia  4b0f8825f3  fix(proxy_server): fix prompt template for proxy server  2023-10-11 21:08:42 -07:00
ishaan-jaff  c3101967b6  (fix) proxy_server linting errors  2023-10-11 20:52:03 -07:00
ishaan-jaff  329d27d1fa  (feat) proxy_server add /v1  2023-10-11 14:35:05 -07:00
Krrish Dholakia  d0ec844629  fix(proxy_server): import errors  2023-10-11 11:05:31 -07:00
Krrish Dholakia  5e5466dfaf  fix(proxy_server.py): use tomli instead of tomllib  2023-10-11 10:43:53 -07:00
ishaan-jaff  d40aeec131  (fix) proxy_server add LiteLLM: Running  2023-10-11 08:42:03 -07:00
Krrish Dholakia  d280a8c434  fix(proxy_cli-and-utils.py): fixing how config file is read + infering llm_provider for known openai endpoints  2023-10-10 20:53:02 -07:00
Krrish Dholakia  b2b724a35c  style(proxy_cli.py): adding feedback box  2023-10-10 13:49:54 -07:00
ishaan-jaff  7496afdf64  (feat) proxy_server add a track_cost_callback for streaming  2023-10-10 11:33:08 -07:00
ishaan-jaff  e19b4fc114  (feat) proxy_server: begin using callback for tracking costs  2023-10-10 08:35:31 -07:00
ishaan-jaff  ba754a07a3  (feat) add --cost as a flag to the proxy server cli  2023-10-09 15:05:17 -07:00
Krrish Dholakia  42e0d7cf68  fix(proxy_server): returns better error messages for invalid api errors  2023-10-09 15:03:44 -07:00
ishaan-jaff  262f874621  (feat) add cost tracking to proxy server  2023-10-09 14:51:37 -07:00
Krrish Dholakia  a9f7a80e3d  feat(proxy_cli.py): add max budget to proxy  2023-10-09 14:11:30 -07:00
Krrish Dholakia  c3e4c3e3f0  fix(proxy_server.py): add link to docs  2023-10-09 11:35:42 -07:00
Krrish Dholakia  3d809707c0  fix(proxy_cli.py): add drop params and add function to prompt in cli (complete issue) https://github.com/BerriAI/litellm/issues/557  2023-10-09 11:33:45 -07:00
Krrish Dholakia  a833e3f929  docs(proxy_server.md): adding /ollama_logs endpoint to docs  2023-10-07 20:38:19 -07:00
Krrish Dholakia  51e5e2b8d5  docs(proxy_server): doc cleanup  2023-10-07 17:29:04 -07:00
ishaan-jaff  051b21b61f  (feat) proxy_server display model list when user does not specify model  2023-10-07 17:19:02 -07:00
Krrish Dholakia  7e34736a38  fix(add-custom-success-callback-for-streaming): add custom success callback for streaming  2023-10-06 15:02:02 -07:00
Krrish Dholakia  e162a9855b  fix(proxy_server.py): make completion call handle "v1" in endpoint url  2023-10-06 09:17:02 -07:00
Krrish Dholakia  3ca79a88bb  improvements to proxy cli and finish reason mapping for anthropic  2023-09-30 18:09:16 -07:00
ishaan-jaff  9b55152d55  use render endpoint for proxy  2023-09-30 17:27:03 -07:00
ishaan-jaff  0c640ab5ef  cli deploy  2023-09-30 15:45:00 -07:00
ishaan-jaff  7804aa1ddf  use deploy flag  2023-09-29 22:03:38 -07:00
Krrish Dholakia  31494796a4  fix linting test  2023-09-29 21:44:46 -07:00
Krrish Dholakia  dc9f02267a  update values  2023-09-29 20:53:55 -07:00
Krrish Dholakia  4665b2a898  updates to proxy  2023-09-28 17:58:47 -07:00
Krrish Dholakia  09b8c08cad  update proxy cli  2023-09-28 16:24:41 -07:00
Krrish Dholakia  d334031108  adding support for completions endpoint in proxy  2023-09-27 21:04:15 -07:00
Krrish Dholakia  c1fce0859c  fix proxy  2023-09-26 15:24:44 -07:00
Renamed from litellm/proxy_server/proxy_server.py
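A per-file listing in the Author / SHA1 / Message / Date shape above comes from `git log` with a custom `--pretty` format (and `--follow` to trace history across a rename like the one noted here). Since the litellm history is not available in this sketch, the snippet below builds a throwaway demo repository to show the format; the author name and commit subject are copied from the log above purely for illustration, and the email address is a placeholder.

```shell
# Minimal sketch: demonstrate the Author / SHA1 / Message / Date log format
# in a throwaway repo (assumes git is installed).
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.name "Krrish Dholakia"   # illustrative author name from the log above
git config user.email "dev@example.com"  # placeholder address
echo "demo" > proxy_server.py
git add proxy_server.py
git commit -q -m "fix(proxy_server.py): add link to docs"  # subject copied from the log
# %an = author name, %h = abbreviated SHA, %s = subject, %ad = author date
git log --date=format:'%Y-%m-%d %H:%M:%S %z' \
        --pretty=format:'%an%n%h %s %ad'
```

Against a real checkout you would drop the repo setup and add `--follow -- <path>` to track the file through its rename.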