f9f2dcc7ea | ishaan-jaff | (fix) set default cost.log | 2023-10-11 15:35:53 -07:00
7c81e7449f | ishaan-jaff | (feat) proxy_server use fallback port if 8000 occupied | 2023-10-11 15:35:14 -07:00
329d27d1fa | ishaan-jaff | (feat) proxy_server add /v1 | 2023-10-11 14:35:05 -07:00
e26f98dce2 | Krrish Dholakia | fix(utils): don't wait for thread to complete to return response | 2023-10-11 14:23:55 -07:00
ae096bb18e | ishaan-jaff | (test) add testing for litellm callback for supabase | 2023-10-11 12:34:10 -07:00
cc55bc886a | ishaan-jaff | (feat) upgrade supabase callback + support logging streaming on supabase | 2023-10-11 12:34:10 -07:00
d0ec844629 | Krrish Dholakia | fix(proxy_server): import errors | 2023-10-11 11:05:31 -07:00
2945918414 | ishaan-jaff | (fix) supabase test | 2023-10-11 11:02:06 -07:00
52b43ef8c1 | ishaan-jaff | bump: version 0.7.1 → 0.7.2 | 2023-10-11 11:01:10 -07:00
ad8ffa9014 | ishaan-jaff | (fix) remove print | 2023-10-11 10:59:43 -07:00
5e5466dfaf | Krrish Dholakia | fix(proxy_server.py): use tomli instead of tomllib | 2023-10-11 10:43:53 -07:00
d40aeec131 | ishaan-jaff | (fix) proxy_server add LiteLLM: Running | 2023-10-11 08:42:03 -07:00
ca7e2f6a05 | Krrish Dholakia | docs(proxy_server.md): add docker image details to docs | 2023-10-11 08:28:08 -07:00
dfd91c5d46 | Krrish Dholakia | fix(openai-py): fix linting errors | 2023-10-10 21:56:14 -07:00
68461f5863 | Krrish Dholakia | fix(openai-py): fix linting issues | 2023-10-10 21:49:14 -07:00
208a6d365b | Krrish Dholakia | fix(openai.py): fix linting errors | 2023-10-10 21:04:13 -07:00
d280a8c434 | Krrish Dholakia | fix(proxy_cli-and-utils.py): fixing how config file is read + infering llm_provider for known openai endpoints | 2023-10-10 20:53:02 -07:00
1b88868f2c | Krrish Dholakia | fix(env-template): fixing togetherai api key naming in env template | 2023-10-10 18:43:42 -07:00
1df3f349fe | Krrish Dholakia | style(proxy_cli): additional tracing | 2023-10-10 18:17:57 -07:00
661ea2359b | Krrish Dholakia | refactor(proxy_cli): adding additional tracing | 2023-10-10 18:12:31 -07:00
a6f35c8d7a | Krrish Dholakia | bump: version 0.7.0 → 0.7.1.dev1 | 2023-10-10 18:07:57 -07:00
0e7b83785b | Krrish Dholakia | fix(init.py): expose complete client session | 2023-10-10 15:16:10 -07:00
b2b724a35c | Krrish Dholakia | style(proxy_cli.py): adding feedback box | 2023-10-10 13:49:54 -07:00
b50013386f | Krrish Dholakia | fix(openai.py): enable custom proxy to pass in ca_bundle_path | 2023-10-10 13:23:27 -07:00
7496afdf64 | ishaan-jaff | (feat) proxy_server add a track_cost_callback for streaming | 2023-10-10 11:33:08 -07:00
af2fd0e0de | Krrish Dholakia | fix: fix value error if model returns empty completion | 2023-10-10 10:11:40 -07:00
b16cbb069a | ishaan-jaff | (test) update supabase logger test | 2023-10-10 10:05:20 -07:00
f97811fb1c | ishaan-jaff | (fix) remove print from supabaseClient | 2023-10-10 09:59:38 -07:00
228d77d345 | ishaan-jaff | (fix) identify users in logging | 2023-10-10 09:56:16 -07:00
5de280c9e5 | ishaan-jaff | (fix) identify users in callbacks | 2023-10-10 09:55:57 -07:00
a89d3ed2af | ishaan-jaff | (fix) supabase fix upsert bug | 2023-10-10 09:55:10 -07:00
35b9f34751 | ishaan-jaff | (fix) supabase callback use litellm.completion_cost | 2023-10-10 08:58:16 -07:00
e19b4fc114 | ishaan-jaff | (feat) proxy_server: begin using callback for tracking costs | 2023-10-10 08:35:31 -07:00
c94ee62bcf | ishaan-jaff | (feat) allow messages to be passed in completion_cost | 2023-10-10 08:35:31 -07:00
29b2fa7f75 | Krrish Dholakia | fix(test-fixes): test fixes | 2023-10-10 08:09:42 -07:00
0d863f00ad | Krrish Dholakia | refactor(bedrock.py): take model names from model cost dict | 2023-10-10 07:35:03 -07:00
db20cb84d4 | Krrish Dholakia | fix(main.py): return n>1 response for openai text completion | 2023-10-09 20:44:07 -07:00
41bebaa1e3 | Krrish Dholakia | test(test_bad_params): fix cohere bad params test | 2023-10-09 20:37:58 -07:00
689371949c | Krrish Dholakia | fix(main.py): read openai org from env | 2023-10-09 16:49:22 -07:00
253e8d27db | Krrish Dholakia | fix: bug fix when n>1 passed in | 2023-10-09 16:46:33 -07:00
079122fbf1 | Krrish Dholakia | style(utils.py): return better exceptions (https://github.com/BerriAI/litellm/issues/563) | 2023-10-09 15:28:33 -07:00
a6968d06e6 | Krrish Dholakia | fix(anthropic.py): fix anthropic prompt | 2023-10-09 15:22:58 -07:00
ba754a07a3 | ishaan-jaff | (feat) add --cost as a flag to the proxy server cli | 2023-10-09 15:05:17 -07:00
42e0d7cf68 | Krrish Dholakia | fix(proxy_server): returns better error messages for invalid api errors | 2023-10-09 15:03:44 -07:00
262f874621 | ishaan-jaff | (feat) add cost tracking to proxy server | 2023-10-09 14:51:37 -07:00
a9f7a80e3d | Krrish Dholakia | feat(proxy_cli.py): add max budget to proxy | 2023-10-09 14:11:30 -07:00
4e64f123ef | ishaan-jaff | (fix) api_base, api_version and api_key | 2023-10-09 14:11:05 -07:00
7587e8ce06 | ishaan-jaff | (fix) Bedrock test fix, stop tryin to read aws session token | 2023-10-09 13:49:22 -07:00
8d4c109171 | ishaan-jaff | (test) add async + non stream test for ollama | 2023-10-09 13:47:08 -07:00
bf4ce08640 | ishaan-jaff | (fix) acompletion for ollama non streaing | 2023-10-09 13:47:08 -07:00