| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| ishaan-jaff | d40aeec131 | (fix) proxy_server add LiteLLM: Running | 2023-10-11 08:42:03 -07:00 |
| Krrish Dholakia | ca7e2f6a05 | docs(proxy_server.md): add docker image details to docs | 2023-10-11 08:28:08 -07:00 |
| Krrish Dholakia | d280a8c434 | fix(proxy_cli-and-utils.py): fixing how config file is read + infering llm_provider for known openai endpoints | 2023-10-10 20:53:02 -07:00 |
| Krrish Dholakia | 1b88868f2c | fix(env-template): fixing togetherai api key naming in env template | 2023-10-10 18:43:42 -07:00 |
| Krrish Dholakia | 1df3f349fe | style(proxy_cli): additional tracing | 2023-10-10 18:17:57 -07:00 |
| Krrish Dholakia | 661ea2359b | refactor(proxy_cli): adding additional tracing | 2023-10-10 18:12:31 -07:00 |
| Krrish Dholakia | a6f35c8d7a | bump: version 0.7.0 → 0.7.1.dev1 | 2023-10-10 18:07:57 -07:00 |
| Krrish Dholakia | b2b724a35c | style(proxy_cli.py): adding feedback box | 2023-10-10 13:49:54 -07:00 |
| Krrish Dholakia | b50013386f | fix(openai.py): enable custom proxy to pass in ca_bundle_path | 2023-10-10 13:23:27 -07:00 |
| ishaan-jaff | 7496afdf64 | (feat) proxy_server add a track_cost_callback for streaming | 2023-10-10 11:33:08 -07:00 |
| ishaan-jaff | e19b4fc114 | (feat) proxy_server: begin using callback for tracking costs | 2023-10-10 08:35:31 -07:00 |
| ishaan-jaff | ba754a07a3 | (feat) add --cost as a flag to the proxy server cli | 2023-10-09 15:05:17 -07:00 |
| Krrish Dholakia | 42e0d7cf68 | fix(proxy_server): returns better error messages for invalid api errors | 2023-10-09 15:03:44 -07:00 |
| ishaan-jaff | 262f874621 | (feat) add cost tracking to proxy server | 2023-10-09 14:51:37 -07:00 |
| Krrish Dholakia | a9f7a80e3d | feat(proxy_cli.py): add max budget to proxy | 2023-10-09 14:11:30 -07:00 |
| Krrish Dholakia | 4059f408d0 | fix(proxy_cli): accept drop params and add_function_to_prompt | 2023-10-09 13:10:07 -07:00 |
| Krrish Dholakia | c3e4c3e3f0 | fix(proxy_server.py): add link to docs | 2023-10-09 11:35:42 -07:00 |
| Krrish Dholakia | 3d809707c0 | fix(proxy_cli.py): add drop params and add function to prompt in cli (complete issue) https://github.com/BerriAI/litellm/issues/557 | 2023-10-09 11:33:45 -07:00 |
| Sir-Photch | 708d61b207 | make --test respect host and port | 2023-10-08 20:42:49 +02:00 |
| Christoph | 64c9795871 | Add host cli parameter | 2023-10-08 17:59:47 +00:00 |
| Krrish Dholakia | a833e3f929 | docs(proxy_server.md): adding /ollama_logs endpoint to docs | 2023-10-07 20:38:19 -07:00 |
| Krrish Dholakia | 51e5e2b8d5 | docs(proxy_server): doc cleanup | 2023-10-07 17:29:04 -07:00 |
| ishaan-jaff | 051b21b61f | (feat) proxy_server display model list when user does not specify model | 2023-10-07 17:19:02 -07:00 |
| ishaan-jaff | e987d31028 | (feat+fix) proxy_cli max_tokens int, --test stream | 2023-10-07 16:38:40 -07:00 |
| Krrish Dholakia | f0c9c24925 | fix(proxy_cli.py): check if model passed in | 2023-10-07 07:52:02 -07:00 |
| Krrish Dholakia | 52b0bcb5ec | feat(proxy_cli.py): when user calls ollama model, run ollama serve | 2023-10-06 16:46:52 -07:00 |
| Krrish Dholakia | 7e34736a38 | fix(add-custom-success-callback-for-streaming): add custom success callback for streaming | 2023-10-06 15:02:02 -07:00 |
| Krrish Dholakia | e162a9855b | fix(proxy_server.py): make completion call handle "v1" in endpoint url | 2023-10-06 09:17:02 -07:00 |
| Krrish Dholakia | 3ca79a88bb | improvements to proxy cli and finish reason mapping for anthropic | 2023-09-30 18:09:16 -07:00 |
| ishaan-jaff | 9b55152d55 | use render endpoint for proxy | 2023-09-30 17:27:03 -07:00 |
| ishaan-jaff | 84133f8f45 | better cli | 2023-09-30 15:52:44 -07:00 |
| ishaan-jaff | 0c640ab5ef | cli deploy | 2023-09-30 15:45:00 -07:00 |
| ishaan-jaff | 16942458a2 | more stuff for cli | 2023-09-30 15:45:00 -07:00 |
| ishaan-jaff | b2e3d3bf7d | add --test to proxy | 2023-09-30 15:45:00 -07:00 |
| ishaan-jaff | 7804aa1ddf | use deploy flag | 2023-09-29 22:03:38 -07:00 |
| Krrish Dholakia | 31494796a4 | fix linting test | 2023-09-29 21:44:46 -07:00 |
| ishaan-jaff | 82c642f78d | fix merge conflicts | 2023-09-29 21:16:06 -07:00 |
| ishaan-jaff | 3f7740ddbc | add deploy flag to cli | 2023-09-29 21:14:01 -07:00 |
| ishaan-jaff | 71dbd9b61f | add --deploy | 2023-09-29 21:12:03 -07:00 |
| Krrish Dholakia | dc9f02267a | update values | 2023-09-29 20:53:55 -07:00 |
| ishaan-jaff | 19b14182b7 | add --deploy | 2023-09-29 18:27:47 -07:00 |
| Krrish Dholakia | 4665b2a898 | updates to proxy | 2023-09-28 17:58:47 -07:00 |
| Krrish Dholakia | 09b8c08cad | update proxy cli | 2023-09-28 16:24:41 -07:00 |
| Krrish Dholakia | 1a05287461 | update docs' | 2023-09-28 13:13:21 -07:00 |
| Krrish Dholakia | d334031108 | adding support for completions endpoint in proxy | 2023-09-27 21:04:15 -07:00 |
| Krrish Dholakia | c1fce0859c | fix proxy | 2023-09-26 15:24:44 -07:00 |