litellm/litellm — latest commit 2023-11-06 17:58:06 -08:00
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| integrations | fix(utils.py): better exception raising if logging object is not able to get set | 2023-11-06 06:34:27 -08:00 |
| llms | (fix) hf don't fail when logprob is None | 2023-11-06 14:22:09 -08:00 |
| proxy | (fix) proxy server - print error msg on exceptions | 2023-11-06 17:55:33 -08:00 |
| tests | (test) hf inference api - text_completion | 2023-11-06 17:56:41 -08:00 |
| .env.template | fix(env-template): fixing togetherai api key naming in env template | 2023-10-10 18:43:42 -07:00 |
| __init__.py | bump: version 0.13.1 → 0.13.2.dev1 | 2023-11-05 21:12:13 -08:00 |
| _version.py | formatting improvements | 2023-08-28 09:20:50 -07:00 |
| budget_manager.py | refactor(all-files): removing all print statements; adding pre-commit + flake8 to prevent future regressions | 2023-11-04 12:50:15 -07:00 |
| caching.py | refactor(all-files): removing all print statements; adding pre-commit + flake8 to prevent future regressions | 2023-11-04 12:50:15 -07:00 |
| config.json | new config.json | 2023-09-01 14:16:12 -07:00 |
| cost.json | store llm costs in budget manager | 2023-09-09 19:11:35 -07:00 |
| exceptions.py | add contributor message to code | 2023-09-25 10:00:10 -07:00 |
| gpt_cache.py | (fix) cleanup | 2023-11-02 14:52:33 -07:00 |
| main.py | (feat) parallel HF text completion + completion_with_retries show exception | 2023-11-06 17:58:06 -08:00 |
| model_prices_and_context_window_backup.json | fix(init.py): adding local cached copy of model mapping for fallbacks | 2023-10-18 13:59:12 -07:00 |
| router.py | Merge pull request #722 from karvetskiy/fix-router-caching | 2023-10-31 16:39:18 -07:00 |
| testing.py | add contributor message to code | 2023-09-25 10:00:10 -07:00 |
| timeout.py | (fix) stability imp: completion() timeout during high traffic, should not raise exception | 2023-11-06 17:54:35 -08:00 |
| utils.py | (fix) HF round up temperature 0 -> 0.01 | 2023-11-06 14:35:06 -08:00 |