litellm-mirror/litellm
Latest commit: faa78ad543 "bump version" — Krrish Dholakia, 2023-09-05 14:30:03 -07:00
Name           Last commit message                                                      Date
__pycache__/   bump version                                                             2023-09-05 14:30:03 -07:00
integrations/  update streaming docs to show it working for async completion calls      2023-09-05 09:18:37 -07:00
llms/          only use tgai's prompt template for llama2 instruct models               2023-09-05 12:25:52 -07:00
tests/         bump version                                                             2023-09-05 14:30:03 -07:00
.DS_Store      version 0.1.2                                                            2023-07-31 08:48:49 -07:00
__init__.py    update init with comments                                                2023-09-05 09:14:57 -07:00
_version.py    formatting improvements                                                  2023-08-28 09:20:50 -07:00
caching.py     fix redis caching                                                        2023-08-28 22:10:15 -07:00
config.json    new config.json                                                          2023-09-01 14:16:12 -07:00
exceptions.py  update exception mapping and get model cost map                          2023-09-04 11:53:20 -07:00
gpt_cache.py   fix caching                                                              2023-08-28 14:53:41 -07:00
main.py        bump version                                                             2023-09-05 14:30:03 -07:00
testing.py     use api_base instead of custom_api_base                                  2023-09-02 17:11:30 -07:00
timeout.py     mypy linting fixes 2                                                     2023-08-18 11:16:31 -07:00
utils.py       adding first-party + custom prompt templates for huggingface             2023-09-04 14:54:09 -07:00