Commit graph

4667 commits

Author SHA1 Message Date
Krrish Dholakia
d9ddd7b338 fix(proxy_server): cleaning up print statements 2023-10-13 20:16:31 -07:00
Krrish Dholakia
3fc0375e34 fix(proxy_server): fix cors issue 2023-10-13 15:47:28 -07:00
Krrish Dholakia
4d4f8bfa5d feat(proxy_server): adding model fallbacks and default model to toml 2023-10-13 15:31:17 -07:00
Krrish Dholakia
2c0280cff3 fix(proxy_cli): add logs and config 2023-10-13 15:14:21 -07:00
Krrish Dholakia
f2eb1b4658 fix(proxy/): remove cloned repo 2023-10-12 21:46:18 -07:00
Krrish Dholakia
06f930a5fb refactor(proxy_server): clean up print statements 2023-10-12 21:39:16 -07:00
Krrish Dholakia
8dc009255b fix(init.py): fix linting errors 2023-10-12 21:31:53 -07:00
Krrish Dholakia
606543eac8 fix(gitmodules): remapping to new proxy 2023-10-12 21:23:53 -07:00
Krrish Dholakia
4f172101df docs(proxy): added readme 2023-10-12 21:09:40 -07:00
Krrish Dholakia
b28c055896 feat(proxy_server): adds create-proxy feature 2023-10-12 18:27:07 -07:00
ishaan-jaff
4b3e4c97b8 (feat) show costs.json in proxy_server.py 2023-10-12 15:07:37 -07:00
ishaan-jaff
e5ae870dd5 (feat) proxy_server new cost.json on litellm. Track daily cost & num_requests 2023-10-12 11:37:35 -07:00
Krrish Dholakia
098a86f678 fix(proxy_cli): prints the location of the config file 2023-10-11 21:19:44 -07:00
Krrish Dholakia
4b0f8825f3 fix(proxy_server): fix prompt template for proxy server 2023-10-11 21:08:42 -07:00
ishaan-jaff
c3101967b6 (fix) proxy_server linting errors 2023-10-11 20:52:03 -07:00
ishaan-jaff
f9f2dcc7ea (fix) set default cost.log 2023-10-11 15:35:53 -07:00
ishaan-jaff
7c81e7449f (feat) proxy_server use fallback port if 8000 occupied 2023-10-11 15:35:14 -07:00
ishaan-jaff
329d27d1fa (feat) proxy_server add /v1 2023-10-11 14:35:05 -07:00
Krrish Dholakia
d0ec844629 fix(proxy_server): import errors 2023-10-11 11:05:31 -07:00
ishaan-jaff
ad8ffa9014 (fix) remove print 2023-10-11 10:59:43 -07:00
Krrish Dholakia
5e5466dfaf fix(proxy_server.py): use tomli instead of tomllib 2023-10-11 10:43:53 -07:00
ishaan-jaff
d40aeec131 (fix) proxy_server add LiteLLM: Running 2023-10-11 08:42:03 -07:00
Krrish Dholakia
ca7e2f6a05 docs(proxy_server.md): add docker image details to docs 2023-10-11 08:28:08 -07:00
Krrish Dholakia
d280a8c434 fix(proxy_cli-and-utils.py): fixing how config file is read + infering llm_provider for known openai endpoints 2023-10-10 20:53:02 -07:00
Krrish Dholakia
1b88868f2c fix(env-template): fixing togetherai api key naming in env template 2023-10-10 18:43:42 -07:00
Krrish Dholakia
1df3f349fe style(proxy_cli): additional tracing 2023-10-10 18:17:57 -07:00
Krrish Dholakia
661ea2359b refactor(proxy_cli): adding additional tracing 2023-10-10 18:12:31 -07:00
Krrish Dholakia
a6f35c8d7a bump: version 0.7.0 → 0.7.1.dev1 2023-10-10 18:07:57 -07:00
Krrish Dholakia
b2b724a35c style(proxy_cli.py): adding feedback box 2023-10-10 13:49:54 -07:00
Krrish Dholakia
b50013386f fix(openai.py): enable custom proxy to pass in ca_bundle_path 2023-10-10 13:23:27 -07:00
ishaan-jaff
7496afdf64 (feat) proxy_server add a track_cost_callback for streaming 2023-10-10 11:33:08 -07:00
ishaan-jaff
e19b4fc114 (feat) proxy_server: begin using callback for tracking costs 2023-10-10 08:35:31 -07:00
ishaan-jaff
ba754a07a3 (feat) add --cost as a flag to the proxy server cli 2023-10-09 15:05:17 -07:00
Krrish Dholakia
42e0d7cf68 fix(proxy_server): returns better error messages for invalid api errors 2023-10-09 15:03:44 -07:00
ishaan-jaff
262f874621 (feat) add cost tracking to proxy server 2023-10-09 14:51:37 -07:00
Krrish Dholakia
a9f7a80e3d feat(proxy_cli.py): add max budget to proxy 2023-10-09 14:11:30 -07:00
Krrish Dholakia
4059f408d0 fix(proxy_cli): accept drop params and add_function_to_prompt 2023-10-09 13:10:07 -07:00
Krrish Dholakia
c3e4c3e3f0 fix(proxy_server.py): add link to docs 2023-10-09 11:35:42 -07:00
Krrish Dholakia
3d809707c0 fix(proxy_cli.py): add drop params and add function to prompt in cli (complete issue) https://github.com/BerriAI/litellm/issues/557 2023-10-09 11:33:45 -07:00
Sir-Photch
708d61b207 make --test respect host and port 2023-10-08 20:42:49 +02:00
Christoph
64c9795871 Add host cli parameter 2023-10-08 17:59:47 +00:00
Krrish Dholakia
a833e3f929 docs(proxy_server.md): adding /ollama_logs endpoint to docs 2023-10-07 20:38:19 -07:00
Krrish Dholakia
51e5e2b8d5 docs(proxy_server): doc cleanup 2023-10-07 17:29:04 -07:00
ishaan-jaff
051b21b61f (feat) proxy_server display model list when user does not specify model 2023-10-07 17:19:02 -07:00
ishaan-jaff
e987d31028 (feat+fix) proxy_cli max_tokens int, --test stream 2023-10-07 16:38:40 -07:00
Krrish Dholakia
f0c9c24925 fix(proxy_cli.py): check if model passed in 2023-10-07 07:52:02 -07:00
Krrish Dholakia
52b0bcb5ec feat(proxy_cli.py): when user calls ollama model, run ollama serve 2023-10-06 16:46:52 -07:00
Krrish Dholakia
7e34736a38 fix(add-custom-success-callback-for-streaming): add custom success callback for streaming 2023-10-06 15:02:02 -07:00
Krrish Dholakia
e162a9855b fix(proxy_server.py): make completion call handle "v1" in endpoint url 2023-10-06 09:17:02 -07:00
Krrish Dholakia
3ca79a88bb improvements to proxy cli and finish reason mapping for anthropic 2023-09-30 18:09:16 -07:00