Author | Commit | Message | Date
ishaan-jaff | 57a7c5e659 | (feat) failure handler - log exceptions when incorrect model passed and result=None | 2023-10-19 09:11:58 -07:00
ishaan-jaff | de3005ee9e | (fix) allow using more than 1 custom callback | 2023-10-19 09:11:58 -07:00
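The multi-callback fix above restores registering several callables at once. A minimal sketch, assuming the `(kwargs, completion_response, start_time, end_time)` signature litellm's custom-callback docs describe:

```python
import litellm
from litellm import completion

def log_to_console(kwargs, completion_response, start_time, end_time):
    # kwargs carries the original call params (model, messages, ...)
    print(f"model={kwargs['model']} latency={end_time - start_time}")

def log_to_my_db(kwargs, completion_response, start_time, end_time):
    pass  # hypothetical second sink; any callable with this signature works

# with the fix, both callables fire on every successful call
litellm.success_callback = [log_to_console, log_to_my_db]

response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
)
```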
Nir Gazit | 9e9855d814 | fix: bugs in traceloop integration | 2023-10-19 17:23:51 +02:00
ishaan-jaff | 23cfc009e0 | (feat) add langsmith logger to litellm | 2023-10-18 11:39:37 -07:00
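A sketch of turning on the new LangSmith logger, assuming litellm's by-name alias pattern for built-in loggers and a `LANGSMITH_API_KEY` environment variable:

```python
import os
import litellm
from litellm import completion

os.environ["LANGSMITH_API_KEY"] = "ls-..."  # your LangSmith key

litellm.success_callback = ["langsmith"]  # built-in loggers are enabled by name

response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "what is litellm?"}],
)
```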
ishaan-jaff | e8146b16a4 | (fix) update docstring for get_max_tokens | 2023-10-18 09:16:34 -07:00
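For context, `get_max_tokens` looks a model up in litellm's model-cost map. A sketch; note the return shape has varied across versions (a model-info dict in releases from this period, a bare integer later):

```python
from litellm import get_max_tokens

info = get_max_tokens("gpt-3.5-turbo")
print(info)  # e.g. {"max_tokens": 4097, ...} here; later versions return just the int
```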
ishaan-jaff | c67895315d | (feat) weights & biases logger | 2023-10-17 18:01:53 -07:00
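The Weights & Biases logger follows the same by-name pattern; a sketch assuming `WANDB_API_KEY` is set and the `wandb` package is installed:

```python
import os
import litellm
from litellm import completion

os.environ["WANDB_API_KEY"] = "..."  # requires `pip install wandb`

litellm.success_callback = ["wandb"]

response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
)
```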
Krrish Dholakia | eb05b093ad | fix(utils.py): mapping azure api version missing exception | 2023-10-17 17:12:51 -07:00
Krrish Dholakia | 44cafb5bac | docs(proxy_server.md): update proxy server docs to include multi-agent autogen tutorial | 2023-10-17 09:22:34 -07:00
Krrish Dholakia | f221eac41a | fix(proxy_server): improve error handling | 2023-10-16 19:42:53 -07:00
Zeeland | 7b1d55d110 | fix: llm_provider add openai finetune compatibility | 2023-10-16 18:44:45 +08:00
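The `llm_provider` fix concerns OpenAI fine-tune model strings. A sketch of the intended behavior, assuming the `ft:` prefix is what provider inference learned to route to OpenAI (the model ID below is hypothetical):

```python
from litellm import completion

# "ft:..." should now be inferred as an OpenAI model, so no
# explicit custom_llm_provider argument is needed
response = completion(
    model="ft:gpt-3.5-turbo:my-org:custom-suffix:id",  # hypothetical fine-tune ID
    messages=[{"role": "user", "content": "hello"}],
)
```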
ishaan-jaff | c0f7c7a001 | (feat) new function_to_dict in litellm.utils | 2023-10-14 18:26:15 -07:00
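`function_to_dict` converts an annotated, documented Python function into an OpenAI function-calling schema. A sketch, assuming the numpydoc-style docstring the utility parses (it needs the `numpydoc` package installed):

```python
from litellm.utils import function_to_dict

def get_current_weather(location: str, unit: str = "fahrenheit"):
    """
    Get the current weather in a given location.

    Parameters
    ----------
    location : str
        The city and state, e.g. San Francisco, CA
    unit : str
        Temperature unit, either "celsius" or "fahrenheit"
    """
    return f"72 {unit} in {location}"

schema = function_to_dict(get_current_weather)
# schema -> {"name": "get_current_weather", "description": ..., "parameters": {...}}
```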
ishaan-jaff | 2cdf0439d1 | (fix) handle deepinfra/mistral temp for mistral | 2023-10-14 16:47:25 -07:00
Krrish Dholakia | 5f9dd0b21f | bump: version 0.8.4 → 0.8.5 | 2023-10-14 16:43:06 -07:00
ishaan-jaff | 60f2b32205 | (feat) add doc string for litellm.utils | 2023-10-14 16:12:21 -07:00
Krrish Dholakia | 5269d0bd5f | docs(custom_callback.md): add details on what kwargs are passed to custom callbacks | 2023-10-14 11:29:26 -07:00
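Per those docs, the first callback argument is a `kwargs` dict describing the call. A sketch of inspecting it, with the keys assumed from the custom_callback docs of this period (the exact set may vary by version):

```python
import litellm

def inspect_kwargs(kwargs, completion_response, start_time, end_time):
    print(kwargs.get("model"))            # model the call was made with
    print(kwargs.get("messages"))         # input messages
    print(kwargs.get("optional_params"))  # provider-specific params litellm mapped
    print(kwargs.get("litellm_params"))   # litellm-internal metadata

litellm.success_callback = [inspect_kwargs]
```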
Krrish Dholakia | fa0ff12570 | fix(utils.py): read env variables for known openai-compatible APIs (e.g. perplexity) dynamically from the environment | 2023-10-13 22:43:32 -07:00
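A sketch of the behavior that fix targets, assuming `PERPLEXITYAI_API_KEY` is the variable litellm reads for Perplexity and that the model name below matches their naming scheme:

```python
import os
from litellm import completion

# with the fix, the key is read from the environment at call time
# rather than cached at import time
os.environ["PERPLEXITYAI_API_KEY"] = "pplx-..."

response = completion(
    model="perplexity/mistral-7b-instruct",  # model name assumed for illustration
    messages=[{"role": "user", "content": "hi"}],
)
```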
Krrish Dholakia | ec5e7aa4a9 | fix(openai.py): adding support for exception mapping for openai-compatible APIs via http calls | 2023-10-13 21:56:51 -07:00
Krrish Dholakia | d60518fccf | refactor(utils.py): clean up print statement | 2023-10-13 15:33:12 -07:00
ishaan-jaff | c486bcad91 | (fix) ensure stop is always a list for anthropic | 2023-10-12 21:25:18 -07:00
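A sketch of the coercion that fix adds: a bare string `stop` is wrapped into the list Anthropic's `stop_sequences` parameter expects:

```python
from litellm import completion

# passing a plain string is now safe; litellm should coerce it to ["\n\nHuman:"]
response = completion(
    model="claude-instant-1",
    messages=[{"role": "user", "content": "count to 3"}],
    stop="\n\nHuman:",
)
```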
Krrish Dholakia | d0b4dfd26c | feat(proxy_server): adds create-proxy feature | 2023-10-12 18:27:07 -07:00
ishaan-jaff | e77fc7e10d | (feat) bedrock add finish_reason to streaming responses | 2023-10-12 16:22:34 -07:00
ishaan-jaff | a021637e6e | (feat) add Rate Limit Error for bedrock | 2023-10-12 15:57:34 -07:00
ishaan-jaff | 797892b679 | (fix) add bedrock exception mapping for Auth | 2023-10-12 15:38:09 -07:00
ishaan-jaff | 0eeed31652 | (feat) add ollama exception mapping | 2023-10-11 17:00:39 -07:00
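These mapping commits let callers handle bedrock/ollama failures through litellm's OpenAI-style exception types; a minimal sketch:

```python
from litellm import completion
from litellm.exceptions import AuthenticationError, RateLimitError

try:
    response = completion(
        model="bedrock/anthropic.claude-v2",
        messages=[{"role": "user", "content": "hi"}],
    )
except AuthenticationError as e:
    print(f"auth failed: {e}")   # bad/missing AWS credentials surface here
except RateLimitError as e:
    print(f"rate limited: {e}")  # bedrock throttling surfaces here
```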
Krrish Dholakia | 382dd0a13d | fix(utils): remove ui to view error message | 2023-10-11 16:01:57 -07:00
Krrish Dholakia | dac3a57095 | fix(utils): don't wait for thread to complete to return response | 2023-10-11 14:23:55 -07:00
ishaan-jaff | 5c92b9c4c1 | (feat) upgrade supabase callback + support logging streaming on supabase | 2023-10-11 12:34:10 -07:00
Krrish Dholakia | 87e5f79924 | fix(proxy_cli-and-utils.py): fixing how config file is read + inferring llm_provider for known openai endpoints | 2023-10-10 20:53:02 -07:00
Krrish Dholakia | cc0e4f4f9f | fix: fix value error if model returns empty completion | 2023-10-10 10:11:40 -07:00
ishaan-jaff | a214a0f328 | (fix) remove print from supabaseClient | 2023-10-10 09:59:38 -07:00
ishaan-jaff | 303d27923e | (fix) identify users in logging | 2023-10-10 09:56:16 -07:00
ishaan-jaff | 0e5997c7ea | (fix) identify users in callbacks | 2023-10-10 09:55:57 -07:00
ishaan-jaff | faaa263cd7 | (feat) allow messages to be passed in completion_cost | 2023-10-10 08:35:31 -07:00
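A sketch of the new `completion_cost` usage, assuming the `messages`/`completion` keyword pair this commit enables:

```python
from litellm import completion_cost

cost = completion_cost(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    completion="I'm doing well, thanks for asking!",
)
print(f"estimated cost: ${cost:.6f}")
```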
Krrish Dholakia | 22ee0c6931 | refactor(bedrock.py): take model names from model cost dict | 2023-10-10 07:35:03 -07:00
Krrish Dholakia | 4a1bd8267f | fix: bug fix when n>1 passed in | 2023-10-09 16:46:33 -07:00
Krrish Dholakia | 0f146bce49 | style(utils.py): return better exceptions (https://github.com/BerriAI/litellm/issues/563) | 2023-10-09 15:28:33 -07:00
Krrish Dholakia | 936548db40 | feat(factory.py): option to add function details to prompt, if model doesn't support functions param | 2023-10-09 09:53:53 -07:00
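That factory.py option is gated behind a module-level flag; a sketch, assuming `litellm.add_function_to_prompt` is the switch:

```python
import litellm
from litellm import completion

# when the target model has no native `functions` support,
# serialize the function spec into the prompt text instead
litellm.add_function_to_prompt = True

response = completion(
    model="claude-instant-1",
    messages=[{"role": "user", "content": "What's the weather in SF?"}],
    functions=[{
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    }],
)
```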
Krrish Dholakia | cf7e2595b8 | fix(utils): adds complete streaming response to success handler | 2023-10-07 15:42:00 -07:00
ishaan-jaff | b188816a89 | feat: rate limit aware acompletion calls | 2023-10-06 20:48:53 -07:00
ishaan-jaff | e0eedbc183 | chore: stash rate limit manager changes | 2023-10-06 16:22:02 -07:00
Krrish Dholakia | 37d7837b63 | feat(ollama.py): exposing ollama config | 2023-10-06 15:52:58 -07:00
Krrish Dholakia | 5ab3a4b8d7 | fix(streaming): add custom success callback for streaming | 2023-10-06 15:02:02 -07:00
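Taken together with cf7e2595b8 above, a success callback can now see the assembled stream. A sketch, assuming the `complete_streaming_response` key litellm places in `kwargs`:

```python
import litellm
from litellm import completion

def on_success(kwargs, completion_response, start_time, end_time):
    # populated once the stream finishes; absent for non-streaming calls
    full = kwargs.get("complete_streaming_response")
    if full is not None:
        print(full["choices"][0]["message"]["content"])

litellm.success_callback = [on_success]

for chunk in completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
    stream=True,
):
    pass  # consume the stream; the callback fires at the end
```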
Krrish Dholakia | 69cdf5347a | style(test_completion.py): fix merge conflict | 2023-10-05 22:09:38 -07:00
ishaan-jaff | 0da3150aa3 | fix(n param in completion()): fix error thrown when passing n for cohere | 2023-10-05 19:54:13 -07:00
ishaan-jaff | 605369bc2a | fix(llmonitor callback): correctly set user_id | 2023-10-05 19:36:39 -07:00
ishaan-jaff | 2e1b02e189 | fix(completion()): add request_timeout as a param, fix claude error when request_timeout set | 2023-10-05 19:05:28 -07:00
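A sketch of the new parameter:

```python
from litellm import completion

# request_timeout (seconds) is forwarded to the underlying HTTP call;
# per the fix, this no longer errors for Anthropic/Claude models
response = completion(
    model="claude-instant-1",
    messages=[{"role": "user", "content": "hi"}],
    request_timeout=10,
)
```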
Krrish Dholakia | 1492916a37 | adding custom prompt templates to ollama | 2023-10-05 10:48:16 -07:00
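A sketch of registering one, assuming the role-map shape `litellm.register_prompt_template` documents (pre/post text wrapped around each message):

```python
import litellm
from litellm import completion

litellm.register_prompt_template(
    model="ollama/llama2",
    roles={
        "system": {"pre_message": "[INST] <<SYS>>\n", "post_message": "\n<</SYS>>\n[/INST]\n"},
        "user": {"pre_message": "[INST] ", "post_message": " [/INST]\n"},
        "assistant": {"pre_message": "", "post_message": "\n"},
    },
)

response = completion(
    model="ollama/llama2",
    messages=[{"role": "user", "content": "hi"}],
)
```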
ishaan-jaff | b8f6de1289 | fix linting | 2023-10-04 16:03:58 -07:00
ishaan-jaff | c8810b3f00 | make rate limit handler a class (2) | 2023-10-04 16:03:58 -07:00
Krish Dholakia | 5827b30042 | Merge pull request #530 from vedant-z/patch-1 (Update utils.py) | 2023-10-04 15:42:59 -07:00