forked from phoenix/litellm-mirror
* add prompt_tokens_details in usage response
* use _prompt_tokens_details as a param in Usage
* fix linting errors
* fix type error
* fix ci/cd deps
* bump deps for openai
* bump deps openai
* fix llm translation testing
* fix llm translation embedding
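To make the first two commits concrete, here is a minimal sketch of how the new usage breakdown might be read from a completion response. It is not taken from this PR's diff; the model name and the `cached_tokens` field are assumptions based on OpenAI's usage schema.

```python
# A minimal sketch, assuming the OpenAI-style usage shape described above.
import litellm

response = litellm.completion(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[{"role": "user", "content": "Hello"}],
)

# Usage carries the standard token counts.
usage = response.usage
print(usage.prompt_tokens, usage.completion_tokens, usage.total_tokens)

# With this change, Usage may also carry a prompt_tokens_details breakdown
# (e.g. cached prompt tokens). Guard with getattr, since the field may be
# absent on older versions or for providers that do not report it.
details = getattr(usage, "prompt_tokens_details", None)
if details is not None:
    print("cached prompt tokens:", getattr(details, "cached_tokens", None))
```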
* conftest.py
* Readme.md
* test_anthropic_completion.py
* test_databricks.py
* test_fireworks_ai_translation.py
* test_max_completion_tokens.py
* test_nvidia_nim.py
* test_openai_o1.py
* test_optional_params.py
* test_prompt_caching.py
* test_supports_vision.py
More tests under litellm/litellm/tests/*