Mirror of https://github.com/BerriAI/litellm.git (synced 2025-04-25 02:34:29 +00:00)
* fix(cost_calculator.py): handle custom pricing at deployment level for router
* test: add unit tests
* fix(router.py): show custom pricing on UI check correct model str
* fix: fix linting error
* docs(custom_pricing.md): clarify custom pricing for proxy

  Fixes https://github.com/BerriAI/litellm/issues/8573#issuecomment-2790420740

* test: update code qa test
* fix: cleanup traceback
* fix: handle litellm param custom pricing
* test: update test
* fix(cost_calculator.py): add router model id to list of potential model names
* fix(cost_calculator.py): fix router model id check
* fix: router.py - maintain older model registry approach
* fix: fix ruff check
* fix(router.py): router get deployment info add custom values to mapped dict
* test: update test
* fix(utils.py): update only if value is non-null
* test: add unit test
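The headline change in this PR is deployment-level custom pricing: a deployment registered with the Router can carry its own per-token prices, and the cost calculator resolves them (matched via the router model id) instead of falling back to litellm's global price map. Below is a minimal sketch of what that looks like from the caller's side, assuming the `input_cost_per_token` / `output_cost_per_token` litellm_params described in custom_pricing.md; the model names, API key, and prices are placeholder values, not taken from this PR's tests.

```python
import litellm
from litellm import Router

# Sketch: one deployment with its own per-token pricing attached in litellm_params.
router = Router(
    model_list=[
        {
            "model_name": "gpt-4o",  # alias that callers use
            "litellm_params": {
                "model": "openai/gpt-4o",
                "api_key": "sk-placeholder",        # placeholder credential
                "input_cost_per_token": 0.0000025,  # custom price for this deployment
                "output_cost_per_token": 0.00001,
            },
        }
    ]
)

response = router.completion(
    model="gpt-4o",
    messages=[{"role": "user", "content": "hello"}],
)

# With the fix, cost lookup should pick up the deployment's custom pricing
# (matched through the router model id) rather than the built-in price map.
print(litellm.completion_cost(completion_response=response))
```

The "update only if value is non-null" fix in utils.py suggests these custom values are merged into the model registry only when the deployment actually supplies them, so deployments without custom pricing keep using the default map.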
Repository tree listing:

- integrations
- llms
- mcp_server
- passthrough_endpoints
- proxy/management_endpoints
- adapter.py
- caching.py
- completion.py
- embedding.py
- files.py
- fine_tuning.py
- guardrails.py
- rerank.py
- router.py
- scheduler.py
- services.py
- tag_management.py
- utils.py