litellm-mirror/litellm
| Name | Last commit | Committed |
| --- | --- | --- |
| integrations | (fix) logging callbacks - promptlayer | 2023-11-02 08:18:00 -07:00 |
| llms | fix(bedrock.py): add exception mapping coverage for authentication scenarios | 2023-11-03 18:25:34 -07:00 |
| proxy | refactor(proxy_server.py): print statement showing how to add debug for logs | 2023-11-03 17:41:14 -07:00 |
| tests | (test) text_completion responses | 2023-11-03 22:14:36 -07:00 |
| .env.template | fix(env-template): fixing togetherai api key naming in env template | 2023-10-10 18:43:42 -07:00 |
| __init__.py | refactor(proxy_server.py): print statement showing how to add debug for logs | 2023-11-03 17:41:14 -07:00 |
| _version.py | formatting improvements | 2023-08-28 09:20:50 -07:00 |
| budget_manager.py | remove budget manager | 2023-09-30 11:42:56 -07:00 |
| caching.py | fix(caching.py): fixing pr issues | 2023-10-31 18:32:40 -07:00 |
| config.json | new config.json | 2023-09-01 14:16:12 -07:00 |
| cost.json | store llm costs in budget manager | 2023-09-09 19:11:35 -07:00 |
| exceptions.py | add contributor message to code | 2023-09-25 10:00:10 -07:00 |
| gpt_cache.py | (fix) cleanup | 2023-11-02 14:52:33 -07:00 |
| main.py | (feat) text completion response now OpenAI Object | 2023-11-03 22:13:52 -07:00 |
| model_prices_and_context_window_backup.json | fix(init.py): adding local cached copy of model mapping for fallbacks | 2023-10-18 13:59:12 -07:00 |
| router.py | Merge pull request #722 from karvetskiy/fix-router-caching | 2023-10-31 16:39:18 -07:00 |
| testing.py | add contributor message to code | 2023-09-25 10:00:10 -07:00 |
| timeout.py | fix(completion()): add request_timeout as a param, fix claude error when request_timeout set | 2023-10-05 19:05:28 -07:00 |
| utils.py | (feat) add TextCompletionResponse | 2023-11-03 22:14:07 -07:00 |