litellm-mirror/litellm (last commit: 2023-09-13 09:48:49 -07:00)
Name               Last commit message                                              Date
__pycache__/       return model name as part of streaming object                    2023-09-13 09:48:49 -07:00
integrations/      custom_logger for litellm - callback_func                        2023-09-09 18:41:41 -07:00
llms/              work for hf inference endpoint                                   2023-09-11 18:37:55 -07:00
tests/             return model name as part of streaming object                    2023-09-13 09:48:49 -07:00
.DS_Store          version 0.1.2                                                    2023-07-31 08:48:49 -07:00
__init__.py        raise better exception if llm provider isn't passed in or inferred  2023-09-12 11:28:50 -07:00
_version.py        formatting improvements                                          2023-08-28 09:20:50 -07:00
budget_manager.py  fix testing                                                      2023-09-12 22:03:51 -07:00
caching.py         caching updates                                                  2023-09-08 18:06:47 -07:00
config.json        new config.json                                                  2023-09-01 14:16:12 -07:00
cost.json          store llm costs in budget manager                                2023-09-09 19:11:35 -07:00
exceptions.py      fix linting issues                                               2023-09-06 10:41:52 -07:00
gpt_cache.py       fix caching                                                      2023-08-28 14:53:41 -07:00
main.py            remove verify_access_key from main                               2023-09-12 11:50:30 -07:00
testing.py         use api_base instead of custom_api_base                          2023-09-02 17:11:30 -07:00
timeout.py         fix timeouts + tests + bump v                                    2023-09-05 17:17:58 -07:00
utils.py           return model name as part of streaming object                    2023-09-13 09:48:49 -07:00