litellm-mirror/litellm
Last updated: 2023-12-23 10:03:49 +05:30
| Name | Last commit message | Last commit date |
|---|---|---|
| deprecated_litellm_server/ | fix(litellm_server): commenting out the code | 2023-11-20 15:39:05 -08:00 |
| integrations/ | fix(traceloop.py): add additional openllmetry traces | 2023-12-16 19:21:39 -08:00 |
| llms/ | test(test_completion.py-+-test_streaming.py): add ollama endpoint to ci/cd pipeline | 2023-12-22 12:21:33 +05:30 |
| proxy/ | fix(proxy_server.py): manage budget at user-level not key-level | 2023-12-22 15:10:38 +05:30 |
| router_strategy/ | fix(router.py): fix least-busy routing | 2023-12-08 20:29:49 -08:00 |
| tests/ | fix(langsmith.py): fix langsmith streaming logging | 2023-12-23 10:02:35 +05:30 |
| __init__.py | feat(main.py): add support for image generation endpoint | 2023-12-16 21:07:29 -08:00 |
| _logging.py | (fix) make print_verbose non blocking | 2023-12-07 17:31:32 -08:00 |
| _redis.py | fix(_redis.py): check if string before checking os.environ | 2023-12-07 15:08:11 -08:00 |
| _version.py | formatting improvements | 2023-08-28 09:20:50 -07:00 |
| budget_manager.py | (fix) make print_verbose non blocking | 2023-12-07 17:31:32 -08:00 |
| caching.py | feat(router.py): support caching groups | 2023-12-15 21:45:51 -08:00 |
| cost.json | store llm costs in budget manager | 2023-09-09 19:11:35 -07:00 |
| exceptions.py | (feat) add openai.NotFoundError | 2023-12-15 10:18:02 +05:30 |
| main.py | fix(router.py): add support for async image generation endpoints | 2023-12-21 14:38:44 +05:30 |
| model_prices_and_context_window_backup.json | (fix) update back model prices with latest llms | 2023-12-11 10:56:01 -08:00 |
| requirements.txt | Add symlink and only copy in source dir to stay under 50MB compressed limit for Lambdas. | 2023-11-22 23:07:33 -05:00 |
| router.py | fix(proxy_server.py): handle misformatted json body in chat completion request | 2023-12-22 12:30:36 +05:30 |
| timeout.py | fix(promptlayer.py): fixing promptlayer logging integration | 2023-11-13 15:04:15 -08:00 |
| utils.py | bump: version 1.15.6 → 1.15.7 | 2023-12-23 10:03:49 +05:30 |