Author | Commit | Message | Date
Krrish Dholakia | 1df3f349fe | style(proxy_cli): additional tracing | 2023-10-10 18:17:57 -07:00
Krrish Dholakia | c5a039be99 | bump: version 0.7.1.dev1 → 0.7.1.dev2 | 2023-10-10 18:12:36 -07:00
Krrish Dholakia | 661ea2359b | refactor(proxy_cli): adding additional tracing | 2023-10-10 18:12:31 -07:00
Krrish Dholakia | a6f35c8d7a | bump: version 0.7.0 → 0.7.1.dev1 | 2023-10-10 18:07:57 -07:00
Krrish Dholakia | 5a72688648 | docs(proxy_server): name fix | 2023-10-10 16:51:38 -07:00
Krrish Dholakia | 15afa6cde9 | docs(proxy_server): adding langdroid tutorial | 2023-10-10 16:28:58 -07:00
Krrish Dholakia | 109826e868 | docs(proxy_server): fix autogen tutorial docs | 2023-10-10 16:27:04 -07:00
Krrish Dholakia | 0e7b83785b | fix(init.py): expose complete client session | 2023-10-10 15:16:10 -07:00
Krrish Dholakia | b2b724a35c | style(proxy_cli.py): adding feedback box | 2023-10-10 13:49:54 -07:00
Krrish Dholakia | 544740bc21 | bump: version 0.6.6 → 0.7.0 | 2023-10-10 13:23:49 -07:00
Krrish Dholakia | b50013386f | fix(openai.py): enable custom proxy to pass in ca_bundle_path | 2023-10-10 13:23:27 -07:00
ishaan-jaff | 7125016d24 | (docs) custom callback for tracking costs | 2023-10-10 11:36:02 -07:00
ishaan-jaff | 7496afdf64 | (feat) proxy_server add a track_cost_callback for streaming | 2023-10-10 11:33:08 -07:00
ishaan-jaff | 68b655df51 | (docs) proxy_server | 2023-10-10 10:38:10 -07:00
ishaan-jaff | 449403e2b3 | (docs) proxy_server | 2023-10-10 10:26:44 -07:00
ishaan-jaff | cf580a4995 | (docs) add cost tracking to proxy server | 2023-10-10 10:19:15 -07:00
Krrish Dholakia | af2fd0e0de | fix: fix value error if model returns empty completion | 2023-10-10 10:11:40 -07:00
ishaan-jaff | 6d81bcc248 | bump: version 0.6.5 → 0.6.6 | 2023-10-10 10:05:47 -07:00
ishaan-jaff | b16cbb069a | (test) update supabase logger test | 2023-10-10 10:05:20 -07:00
ishaan-jaff | 2b8a8297af | (docs) fix tracking end users | 2023-10-10 10:04:47 -07:00
ishaan-jaff | f97811fb1c | (fix) remove print from supabaseClient | 2023-10-10 09:59:38 -07:00
ishaan-jaff | 228d77d345 | (fix) identify users in logging | 2023-10-10 09:56:16 -07:00
ishaan-jaff | 5de280c9e5 | (fix) identify users in callbacks | 2023-10-10 09:55:57 -07:00
ishaan-jaff | a89d3ed2af | (fix) supabase fix upsert bug | 2023-10-10 09:55:10 -07:00
ishaan-jaff | 35b9f34751 | (fix) supabase callback use litellm.completion_cost | 2023-10-10 08:58:16 -07:00
ishaan-jaff | e19b4fc114 | (feat) proxy_server: begin using callback for tracking costs | 2023-10-10 08:35:31 -07:00
ishaan-jaff | a27d9ad6bc | (docs) add ollama/llama2 llm max_tokens + cost | 2023-10-10 08:35:31 -07:00
ishaan-jaff | c94ee62bcf | (feat) allow messages to be passed in completion_cost | 2023-10-10 08:35:31 -07:00
Krrish Dholakia | 29b2fa7f75 | fix(test-fixes): test fixes | 2023-10-10 08:09:42 -07:00
Krrish Dholakia | 0d863f00ad | refactor(bedrock.py): take model names from model cost dict | 2023-10-10 07:35:03 -07:00
Krrish Dholakia | 152ffca815 | docs(model-price-json): add bedrock models | 2023-10-10 07:35:03 -07:00
Krish Dholakia | 7a0dc6487b | Update README.md | 2023-10-10 07:15:10 -07:00
Krrish Dholakia | ed832a8111 | bump: version 0.6.4 → 0.6.5 | 2023-10-09 20:44:15 -07:00
Krrish Dholakia | db20cb84d4 | fix(main.py): return n>1 response for openai text completion | 2023-10-09 20:44:07 -07:00
Krrish Dholakia | 41bebaa1e3 | test(test_bad_params): fix cohere bad params test | 2023-10-09 20:37:58 -07:00
Krrish Dholakia | 689371949c | fix(main.py): read openai org from env | 2023-10-09 16:49:22 -07:00
Krrish Dholakia | b9c582184b | bump: version 0.6.3 → 0.6.4 | 2023-10-09 16:46:33 -07:00
Krrish Dholakia | 253e8d27db | fix: bug fix when n>1 passed in | 2023-10-09 16:46:33 -07:00
ishaan-jaff | 2004b449e8 | (docs) Proxy add tutorial on using multiple llms | 2023-10-09 16:21:40 -07:00
Krrish Dholakia | b14bda6e1a | bump: version 0.6.2 → 0.6.3 | 2023-10-09 15:28:37 -07:00
Krrish Dholakia | 079122fbf1 | style(utils.py): return better exceptions (https://github.com/BerriAI/litellm/issues/563) | 2023-10-09 15:28:33 -07:00
Krrish Dholakia | a6968d06e6 | fix(anthropic.py): fix anthropic prompt | 2023-10-09 15:22:58 -07:00
ishaan-jaff | ba754a07a3 | (feat) add --cost as a flag to the proxy server cli | 2023-10-09 15:05:17 -07:00
Krrish Dholakia | 70720c255e | bump: version 0.6.1 → 0.6.2 | 2023-10-09 15:03:44 -07:00
Krrish Dholakia | 42e0d7cf68 | fix(proxy_server): returns better error messages for invalid api errors | 2023-10-09 15:03:44 -07:00
ishaan-jaff | 262f874621 | (feat) add cost tracking to proxy server | 2023-10-09 14:51:37 -07:00
Krrish Dholakia | a9f7a80e3d | feat(proxy_cli.py): add max budget to proxy | 2023-10-09 14:11:30 -07:00
ishaan-jaff | 4e64f123ef | (fix) api_base, api_version and api_key | 2023-10-09 14:11:05 -07:00
ishaan-jaff | 7587e8ce06 | (fix) Bedrock test fix, stop tryin to read aws session token | 2023-10-09 13:49:22 -07:00
ishaan-jaff | 8d4c109171 | (test) add async + non stream test for ollama | 2023-10-09 13:47:08 -07:00