Krrish Dholakia | 2c371bb8d1 | refactor(test_async_fn.py): refactor for testing | 2023-10-24 12:46:30 -07:00
Krrish Dholakia | f12dc5df21 | fix(vertex_ai.py): fix output parsing | 2023-10-24 12:08:22 -07:00
Krrish Dholakia | 653863f787 | test(test_router.py): fixing router testing | 2023-10-24 10:21:10 -07:00
Krrish Dholakia | c34e9d73ff | fix(openai-proxy/utils.py): adding caching | 2023-10-23 17:01:03 -07:00
ishaan-jaff | aa6a01904d | (fix) get_llm_provider | 2023-10-20 15:00:21 -07:00
ishaan-jaff | 61dd4f167f | (feat) native perplexity support | 2023-10-20 14:29:07 -07:00
Krrish Dholakia | 1f1cf7a11c | feat(main.py): support multiple deployments in 1 completion call | 2023-10-20 13:01:53 -07:00
Krrish Dholakia | 4eeadd284a | feat(utils.py): adding encode and decode functions | 2023-10-20 11:59:47 -07:00
Krrish Dholakia | 4198901a2d | docs(token_usage.md): adding new register model function | 2023-10-19 18:33:53 -07:00
Krrish Dholakia | 8dda69e216 | feat(utils.py): add register model helper function | 2023-10-19 18:26:36 -07:00
ishaan-jaff | 310d65bd62 | (feat) failure handler - log exceptions when incorrect model passed and result=None | 2023-10-19 09:11:58 -07:00
ishaan-jaff | 5cf3c3dc86 | (fix) allow using more than 1 custom callback | 2023-10-19 09:11:58 -07:00
Nir Gazit | c21e86faec | fix: bugs in traceloop integration | 2023-10-19 17:23:51 +02:00
ishaan-jaff | f2d8e43f67 | (feat) add langsmith logger to litellm | 2023-10-18 11:39:37 -07:00
ishaan-jaff | 0c090e3675 | (fix) update docstring for get_max_tokens | 2023-10-18 09:16:34 -07:00
ishaan-jaff | 5fd7720029 | (feat) weights & biases logger | 2023-10-17 18:01:53 -07:00
Krrish Dholakia | a7c3fc2fd9 | fix(utils.py): mapping azure api version missing exception | 2023-10-17 17:12:51 -07:00
Krrish Dholakia | dcb866b353 | docs(proxy_server.md): update proxy server docs to include multi-agent autogen tutorial | 2023-10-17 09:22:34 -07:00
Krrish Dholakia | 541a8b7bc8 | fix(proxy_server): improve error handling | 2023-10-16 19:42:53 -07:00
Zeeland | 9f6138ef0e | fix: llm_provider add openai finetune compatibility | 2023-10-16 18:44:45 +08:00
ishaan-jaff | 7848f1b5b7 | (feat) new function_to_dict litellm.util | 2023-10-14 18:26:15 -07:00
ishaan-jaff | 8f0dd53079 | (fix) handle deepinfra/mistral temp for mistral | 2023-10-14 16:47:25 -07:00
Krrish Dholakia | 7358d2e4ea | bump: version 0.8.4 → 0.8.5 | 2023-10-14 16:43:06 -07:00
ishaan-jaff | 97fc44db53 | (feat) add doc string for litellm.utils | 2023-10-14 16:12:21 -07:00
Krrish Dholakia | d6f2d9b9bb | docs(custom_callback.md): add details on what kwargs are passed to custom callbacks | 2023-10-14 11:29:26 -07:00
Krrish Dholakia | 9513d6b862 | fix(utils.py): read env variables for known openai-compatible api's (e.g. perplexity), dynamically from the environment | 2023-10-13 22:43:32 -07:00
Krrish Dholakia | 91c8e92e71 | fix(openai.py): adding support for exception mapping for openai-compatible apis via http calls | 2023-10-13 21:56:51 -07:00
Krrish Dholakia | b403bac500 | refactor(utils.py): clean up print statement | 2023-10-13 15:33:12 -07:00
ishaan-jaff | 91e2aebe8c | (fix) ensure stop is always a list for anthropic | 2023-10-12 21:25:18 -07:00
Krrish Dholakia | b28c055896 | feat(proxy_server): adds create-proxy feature | 2023-10-12 18:27:07 -07:00
ishaan-jaff | e01d83cea6 | (feat) bedrock add finish_reason to streaming responses | 2023-10-12 16:22:34 -07:00
ishaan-jaff | 66cbba3f55 | (feat) add Rate Limit Error for bedrock | 2023-10-12 15:57:34 -07:00
ishaan-jaff | 640541f2ce | (fix) add bedrock exception mapping for Auth | 2023-10-12 15:38:09 -07:00
ishaan-jaff | 897286ec15 | (feat) add ollama exception mapping | 2023-10-11 17:00:39 -07:00
Krrish Dholakia | e1ee2890b9 | fix(utils): remove ui to view error message | 2023-10-11 16:01:57 -07:00
Krrish Dholakia | e26f98dce2 | fix(utils): don't wait for thread to complete to return response | 2023-10-11 14:23:55 -07:00
ishaan-jaff | cc55bc886a | (feat) upgrade supabase callback + support logging streaming on supabase | 2023-10-11 12:34:10 -07:00
Krrish Dholakia | d280a8c434 | fix(proxy_cli-and-utils.py): fixing how config file is read + inferring llm_provider for known openai endpoints | 2023-10-10 20:53:02 -07:00
Krrish Dholakia | af2fd0e0de | fix: fix value error if model returns empty completion | 2023-10-10 10:11:40 -07:00
ishaan-jaff | f97811fb1c | (fix) remove print from supabaseClient | 2023-10-10 09:59:38 -07:00
ishaan-jaff | 228d77d345 | (fix) identify users in logging | 2023-10-10 09:56:16 -07:00
ishaan-jaff | 5de280c9e5 | (fix) identify users in callbacks | 2023-10-10 09:55:57 -07:00
ishaan-jaff | c94ee62bcf | (feat) allow messages to be passed in completion_cost | 2023-10-10 08:35:31 -07:00
Krrish Dholakia | 0d863f00ad | refactor(bedrock.py): take model names from model cost dict | 2023-10-10 07:35:03 -07:00
Krrish Dholakia | 253e8d27db | fix: bug fix when n>1 passed in | 2023-10-09 16:46:33 -07:00
Krrish Dholakia | 079122fbf1 | style(utils.py): return better exceptions (https://github.com/BerriAI/litellm/issues/563) | 2023-10-09 15:28:33 -07:00
Krrish Dholakia | 704be9dcd1 | feat(factory.py): option to add function details to prompt, if model doesn't support functions param | 2023-10-09 09:53:53 -07:00
Krrish Dholakia | 9cda24e1b2 | fix(utils): adds complete streaming response to success handler | 2023-10-07 15:42:00 -07:00
ishaan-jaff | 228d6ea608 | feat(rate limit aware acompletion calls) | 2023-10-06 20:48:53 -07:00
ishaan-jaff | 56c87febae | chore(stash rate limit manager changes) | 2023-10-06 16:22:02 -07:00