litellm/litellm
Latest commit: 2024-01-08 17:45:00 +05:30
| Name | Last commit message | Date |
| --- | --- | --- |
| deprecated_litellm_server/ | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| integrations/ | (fix) proxy - remove errant print statement | 2024-01-01 10:48:12 +05:30 |
| llms/ | refactor(gemini.py): fix linting issue | 2024-01-08 11:43:33 +05:30 |
| proxy/ | fix(proxy_server.py): improve /health/readiness endpoint to give more details on connected services | 2024-01-08 17:45:00 +05:30 |
| router_strategy/ | fix(lowest_tpm_rpm.py): handle null case for text/message input | 2024-01-02 12:24:29 +05:30 |
| tests/ | Merge pull request #1356 from BerriAI/litellm_improve_proxy_logs | 2024-01-08 14:41:01 +05:30 |
| types/ | (types) routerConfig | 2024-01-02 14:14:29 +05:30 |
| __init__.py | (feat) completion_cost - embeddings + raise Exception | 2024-01-05 13:11:23 +05:30 |
| _logging.py | (fix) proxy - show detailed_debug logs | 2024-01-08 15:34:24 +05:30 |
| _redis.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| _version.py | formatting improvements | 2023-08-28 09:20:50 -07:00 |
| budget_manager.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| caching.py | Merge pull request #1311 from Manouchehri/patch-5 | 2024-01-08 09:47:57 +05:30 |
| cost.json | store llm costs in budget manager | 2023-09-09 19:11:35 -07:00 |
| exceptions.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| main.py | fix(main.py): support cost calculation for text completion streaming object | 2024-01-08 12:41:43 +05:30 |
| model_prices_and_context_window_backup.json | (fix) update back model prices with latest llms | 2023-12-11 10:56:01 -08:00 |
| requirements.txt | Add symlink and only copy in source dir to stay under 50MB compressed limit for Lambdas. | 2023-11-22 23:07:33 -05:00 |
| router.py | fix(router.py): azure client init fix | 2024-01-08 14:56:57 +05:30 |
| timeout.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| utils.py | fix(utils.py): error handling for litellm --model mistral edge case | 2024-01-08 15:09:01 +05:30 |