litellm-mirror/litellm (latest commit: 2023-10-10 18:12:31 -07:00)

Name               Last commit date            Last commit message
__pycache__        2023-10-10 13:49:54 -07:00  style(proxy_cli.py): adding feedback box
integrations       2023-10-10 13:23:27 -07:00  fix(openai.py): enable custom proxy to pass in ca_bundle_path
llms               2023-10-10 15:16:10 -07:00  fix(init.py): expose complete client session
proxy              2023-10-10 18:12:31 -07:00  refactor(proxy_cli): adding additional tracing
tests              2023-10-10 13:23:27 -07:00  fix(openai.py): enable custom proxy to pass in ca_bundle_path
.env.template      2023-09-30 18:09:16 -07:00  improvements to proxy cli and finish reason mapping for anthropic
__init__.py        2023-10-10 15:16:10 -07:00  fix(init.py): expose complete client session
_version.py        2023-08-28 09:20:50 -07:00  formatting improvements
budget_manager.py  2023-09-30 11:42:56 -07:00  remove budget manager
caching.py         2023-10-02 10:27:18 -07:00  add hosted api.litellm.ai for caching
config.json        2023-09-01 14:16:12 -07:00  new config.json
cost.json          2023-09-09 19:11:35 -07:00  store llm costs in budget manager
exceptions.py      2023-09-25 10:00:10 -07:00  add contributor message to code
gpt_cache.py       2023-08-28 14:53:41 -07:00  fix caching
main.py            2023-10-10 13:23:27 -07:00  fix(openai.py): enable custom proxy to pass in ca_bundle_path
testing.py         2023-09-25 10:00:10 -07:00  add contributor message to code
timeout.py         2023-10-05 19:05:28 -07:00  fix(completion()): add request_timeout as a param, fix claude error when request_timeout set
utils.py           2023-10-10 10:11:40 -07:00  fix: fix value error if model returns empty completion