Commit graph

590 commits

Author SHA1 Message Date
Krrish Dholakia
dcb866b353 docs(proxy_server.md): update proxy server docs to include multi-agent autogen tutorial 2023-10-17 09:22:34 -07:00
Krrish Dholakia
541a8b7bc8 fix(proxy_server): improve error handling 2023-10-16 19:42:53 -07:00
Zeeland
9f6138ef0e fix: llm_provider add openai finetune compatibility 2023-10-16 18:44:45 +08:00
ishaan-jaff
7848f1b5b7 (feat) new function_to_dict litellm.utils 2023-10-14 18:26:15 -07:00
ishaan-jaff
8f0dd53079 (fix) handle deepinfra/mistral temp for mistral 2023-10-14 16:47:25 -07:00
Krrish Dholakia
7358d2e4ea bump: version 0.8.4 → 0.8.5 2023-10-14 16:43:06 -07:00
ishaan-jaff
97fc44db53 (feat) add doc string for litellm.utils 2023-10-14 16:12:21 -07:00
Krrish Dholakia
d6f2d9b9bb docs(custom_callback.md): add details on what kwargs are passed to custom callbacks 2023-10-14 11:29:26 -07:00
Krrish Dholakia
9513d6b862 fix(utils.py): read env variables for known openai-compatible api's (e.g. perplexity), dynamically from the environment 2023-10-13 22:43:32 -07:00
Krrish Dholakia
91c8e92e71 fix(openai.py): adding support for exception mapping for openai-compatible apis via http calls 2023-10-13 21:56:51 -07:00
Krrish Dholakia
b403bac500 refactor(utils.py): clean up print statement 2023-10-13 15:33:12 -07:00
ishaan-jaff
91e2aebe8c (fix) ensure stop is always a list for anthropic 2023-10-12 21:25:18 -07:00
Krrish Dholakia
b28c055896 feat(proxy_server): adds create-proxy feature 2023-10-12 18:27:07 -07:00
ishaan-jaff
e01d83cea6 (feat) bedrock add finish_reason to streaming responses 2023-10-12 16:22:34 -07:00
ishaan-jaff
66cbba3f55 (feat) add Rate Limit Error for bedrock 2023-10-12 15:57:34 -07:00
ishaan-jaff
640541f2ce (fix) add bedrock exception mapping for Auth 2023-10-12 15:38:09 -07:00
ishaan-jaff
897286ec15 (feat) add ollama exception mapping 2023-10-11 17:00:39 -07:00
Krrish Dholakia
e1ee2890b9 fix(utils): remove ui to view error message 2023-10-11 16:01:57 -07:00
Krrish Dholakia
e26f98dce2 fix(utils): don't wait for thread to complete to return response 2023-10-11 14:23:55 -07:00
ishaan-jaff
cc55bc886a (feat) upgrade supabase callback + support logging streaming on supabase 2023-10-11 12:34:10 -07:00
Krrish Dholakia
d280a8c434 fix(proxy_cli-and-utils.py): fixing how config file is read + inferring llm_provider for known openai endpoints 2023-10-10 20:53:02 -07:00
Krrish Dholakia
af2fd0e0de fix: fix value error if model returns empty completion 2023-10-10 10:11:40 -07:00
ishaan-jaff
f97811fb1c (fix) remove print from supabaseClient 2023-10-10 09:59:38 -07:00
ishaan-jaff
228d77d345 (fix) identify users in logging 2023-10-10 09:56:16 -07:00
ishaan-jaff
5de280c9e5 (fix) identify users in callbacks 2023-10-10 09:55:57 -07:00
ishaan-jaff
c94ee62bcf (feat) allow messages to be passed in completion_cost 2023-10-10 08:35:31 -07:00
Krrish Dholakia
0d863f00ad refactor(bedrock.py): take model names from model cost dict 2023-10-10 07:35:03 -07:00
Krrish Dholakia
253e8d27db fix: bug fix when n>1 passed in 2023-10-09 16:46:33 -07:00
Krrish Dholakia
079122fbf1 style(utils.py): return better exceptions (https://github.com/BerriAI/litellm/issues/563) 2023-10-09 15:28:33 -07:00
Krrish Dholakia
704be9dcd1 feat(factory.py): option to add function details to prompt, if model doesn't support functions param 2023-10-09 09:53:53 -07:00
Krrish Dholakia
9cda24e1b2 fix(utils): adds complete streaming response to success handler 2023-10-07 15:42:00 -07:00
ishaan-jaff
228d6ea608 (feat) rate limit aware acompletion calls 2023-10-06 20:48:53 -07:00
ishaan-jaff
56c87febae (chore) stash rate limit manager changes 2023-10-06 16:22:02 -07:00
Krrish Dholakia
306a38880d feat(ollama.py): exposing ollama config 2023-10-06 15:52:58 -07:00
Krrish Dholakia
7e34736a38 fix(add-custom-success-callback-for-streaming): add custom success callback for streaming 2023-10-06 15:02:02 -07:00
Krrish Dholakia
dd7e397650 style(test_completion.py): fix merge conflict 2023-10-05 22:09:38 -07:00
ishaan-jaff
4e6e79b20a fix(n param in completion()): fix error thrown when passing n for cohere 2023-10-05 19:54:13 -07:00
ishaan-jaff
1897a1ee46 fix(llmonitor callback): correctly set user_id 2023-10-05 19:36:39 -07:00
ishaan-jaff
8120477be4 fix(completion()): add request_timeout as a param, fix claude error when request_timeout set 2023-10-05 19:05:28 -07:00
Krrish Dholakia
ed31860206 adding custom prompt templates to ollama 2023-10-05 10:48:16 -07:00
ishaan-jaff
e9160a1485 fix linting 2023-10-04 16:03:58 -07:00
ishaan-jaff
34dc176440 make rate limit handler a class 2 2023-10-04 16:03:58 -07:00
Krish Dholakia
24c12d6b9b Merge pull request #530 from vedant-z/patch-1 (Update utils.py) 2023-10-04 15:42:59 -07:00
ishaan-jaff
f6af10b2ca add batch_completion with rate limits to utils 2023-10-04 14:46:11 -07:00
Vedant Borkar
c21eae7f79 Update utils.py 2023-10-05 03:07:50 +05:30
Krrish Dholakia
95899bf60e add additional param mapping 2023-10-03 21:56:08 -07:00
Krrish Dholakia
851cb86daa add support for ai21 input params 2023-10-03 21:05:28 -07:00
ishaan-jaff
20f990f652 remove print statement 2023-10-03 21:01:22 -07:00
ishaan-jaff
ac2d89aee6 remove junk print statements 2023-10-03 20:58:39 -07:00
Krrish Dholakia
e834c063ff fix n=1 issue with langchain 2023-10-03 11:06:59 -07:00