litellm/litellm (repository root; latest commit: 2023-10-07 18:11:03 -07:00)
Name                 Last commit                  Message
__pycache__/         2023-10-07 16:20:17 -07:00   test(test_logging): adding print statements for debugging circle ci
integrations/        2023-10-05 19:36:39 -07:00   fix(llmonitor callback): correctly set user_id
llms/                2023-10-07 15:51:16 -07:00   style(openai.py): using typing for params
proxy/               2023-10-07 17:29:04 -07:00   docs(proxy_server): doc cleanup
tests/               2023-10-07 18:11:03 -07:00   test(test_logging): add back more tests for circle ci
.env.template        2023-09-30 18:09:16 -07:00   improvements to proxy cli and finish reason mapping for anthropic
__init__.py          2023-10-07 13:49:30 -07:00   docs(completion-docs): adds more details on provider-specific params
_version.py          2023-08-28 09:20:50 -07:00   formatting improvements
budget_manager.py    2023-09-30 11:42:56 -07:00   remove budget manager
caching.py           2023-10-02 10:27:18 -07:00   add hosted api.litellm.ai for caching
config.json          2023-09-01 14:16:12 -07:00   new config.json
cost.json            2023-09-09 19:11:35 -07:00   store llm costs in budget manager
exceptions.py        2023-09-25 10:00:10 -07:00   add contributor message to code
gpt_cache.py         2023-08-28 14:53:41 -07:00   fix caching
main.py              2023-10-07 15:43:40 -07:00   style(main.py): clean up print statement
testing.py           2023-09-25 10:00:10 -07:00   add contributor message to code
timeout.py           2023-10-05 19:05:28 -07:00   fix(completion()): add request_timeout as a param, fix claude error when request_timeout set
utils.py             2023-10-07 15:42:00 -07:00   fix(utils): adds complete streaming response to success handler