litellm/litellm (latest commit: 2024-01-05 15:15:29 -06:00)
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| deprecated_litellm_server | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| integrations | (fix) proxy - remove errant print statement | 2024-01-01 10:48:12 +05:30 |
| llms | Merge branch 'BerriAI:main' into feature_allow_claude_prefill | 2024-01-05 15:15:29 -06:00 |
| proxy | (ci/cd) add print_verbose for /key/generate | 2024-01-05 22:38:46 +05:30 |
| router_strategy | fix(lowest_tpm_rpm.py): handle null case for text/message input | 2024-01-02 12:24:29 +05:30 |
| tests | (ci/cd) proxy:test_add_new_key | 2024-01-05 22:53:03 +05:30 |
| types | (types) routerConfig | 2024-01-02 14:14:29 +05:30 |
| __init__.py | (feat) completion_cost - embeddings + raise Exception | 2024-01-05 13:11:23 +05:30 |
| _logging.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| _redis.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| _version.py | formatting improvements | 2023-08-28 09:20:50 -07:00 |
| budget_manager.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| caching.py | fix(caching.py): support s-maxage param for cache controls | 2024-01-04 11:41:23 +05:30 |
| cost.json | store llm costs in budget manager | 2023-09-09 19:11:35 -07:00 |
| exceptions.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| main.py | fix(caching.py): support ttl, s-max-age, and no-cache cache controls | 2024-01-03 12:42:43 +05:30 |
| model_prices_and_context_window_backup.json | (fix) update back model prices with latest llms | 2023-12-11 10:56:01 -08:00 |
| requirements.txt | Add symlink and only copy in source dir to stay under 50MB compressed limit for Lambdas. | 2023-11-22 23:07:33 -05:00 |
| router.py | fix(router.py): don't retry malformed / content policy violating errors (400 status code) | 2024-01-04 22:23:51 +05:30 |
| timeout.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| utils.py | (fix) caching use same "created" in response_object | 2024-01-05 16:03:56 +05:30 |