litellm/litellm directory listing (latest commit: 2024-04-25 16:39:05 -07:00)
| Path | Last commit message | Last commit date |
|---|---|---|
| deprecated_litellm_server | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| integrations | pass alert type on alerting handle | 2024-04-25 13:05:34 -07:00 |
| llms | fix(vertex_ai.py): handle stream=false | 2024-04-25 13:59:37 -07:00 |
| proxy | ui - new build | 2024-04-25 16:39:05 -07:00 |
| router_strategy | fix - increase default penalty for lowest latency | 2024-04-25 07:54:25 -07:00 |
| tests | Merge pull request #3307 from BerriAI/litellm_set_alerts_per_channel | 2024-04-25 16:35:16 -07:00 |
| types | fix(proxy_server.py): fix /config/update/ | 2024-04-24 16:42:42 -07:00 |
| __init__.py | fix - allow users to opt into langfuse default tags | 2024-04-19 16:01:27 -07:00 |
| _logging.py | fix(parallel_request_limiter.py): handle metadata being none | 2024-03-14 10:02:41 -07:00 |
| _redis.py | fix(_redis.py): support redis ssl as a kwarg REDIS_SSL | 2024-04-20 10:19:44 -07:00 |
| _service_logger.py | fix(test_lowest_tpm_rpm_routing_v2.py): unit testing for usage-based-routing-v2 | 2024-04-18 21:38:00 -07:00 |
| _version.py | (fix) ci/cd don't let importing litellm._version block starting proxy | 2024-02-01 16:23:16 -08:00 |
| budget_manager.py | feat(utils.py): support region based pricing for bedrock + use bedrock's token counts if given | 2024-01-26 14:53:58 -08:00 |
| caching.py | Merge branch 'main' into litellm_ssl_caching_fix | 2024-04-19 17:20:27 -07:00 |
| cost.json | store llm costs in budget manager | 2023-09-09 19:11:35 -07:00 |
| exceptions.py | fix - show api_base, model in exceptions | 2024-04-24 14:03:48 -07:00 |
| main.py | refactor(main.py): trigger new build | 2024-04-24 22:04:24 -07:00 |
| model_prices_and_context_window_backup.json | build(model_prices_and_context_window.json): add anthropic tool use system prompt tokens | 2024-04-23 20:01:24 -07:00 |
| requirements.txt | Add symlink and only copy in source dir to stay under 50MB compressed limit for Lambdas. | 2023-11-22 23:07:33 -05:00 |
| router.py | Revert "fix(router.py): fix max retries on set_client" | 2024-04-24 23:19:14 -07:00 |
| timeout.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| utils.py | Merge pull request #3267 from BerriAI/litellm_openai_streaming_fix | 2024-04-24 21:08:33 -07:00 |
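
main.py holds the package's core completion entry point. A minimal sketch of how that module is typically called follows; the model name and the OPENAI_API_KEY environment variable are assumptions for illustration, not taken from the listing above.

```python
import litellm

# Minimal sketch of litellm's unified completion call (main.py).
# Assumes OPENAI_API_KEY is set in the environment; any supported
# provider/model string could be substituted for "gpt-3.5-turbo".
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello"}],
)

# The response object mirrors the OpenAI chat-completion shape.
print(response.choices[0].message.content)
```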
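
router.py and router_strategy cover load balancing across deployments, including the lowest-latency strategy touched by the commits above. A hedged sketch of Router usage follows; the deployment entry and the "latency-based-routing" strategy string are assumptions for illustration.

```python
import os
from litellm import Router

# Hypothetical deployment list for illustration; real entries would point
# at actual keys, endpoints, or multiple copies of the same model.
model_list = [
    {
        "model_name": "gpt-3.5-turbo",
        "litellm_params": {
            "model": "gpt-3.5-turbo",
            "api_key": os.environ.get("OPENAI_API_KEY"),
        },
    },
]

# routing_strategy selects how requests are spread across deployments;
# "latency-based-routing" is assumed here to map onto the lowest-latency
# logic under router_strategy.
router = Router(model_list=model_list, routing_strategy="latency-based-routing")

response = router.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Ping"}],
)
print(response.choices[0].message.content)
```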