forked from phoenix/litellm-mirror
* add prompt_tokens_details in usage response
* use _prompt_tokens_details as a param in Usage
* fix linting errors
* fix type error
* fix ci/cd deps
* bump deps for openai
* bump deps openai
* fix llm translation testing
* fix llm translation embedding
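For context, a minimal sketch of what a usage response carrying `prompt_tokens_details` might look like. The field names mirror the OpenAI usage response shape; the `Usage` and `PromptTokensDetails` dataclasses below are illustrative stand-ins, not the library's own types.

```python
# Illustrative sketch only: these dataclasses stand in for the library's
# Usage/PromptTokensDetails types and mirror the OpenAI usage response shape.
from dataclasses import dataclass, asdict
from typing import Optional


@dataclass
class PromptTokensDetails:
    # Breakdown of prompt tokens, e.g. how many were served from cache.
    cached_tokens: int = 0


@dataclass
class Usage:
    prompt_tokens: int = 0
    completion_tokens: int = 0
    total_tokens: int = 0
    # Optional detail block attached to the usage response.
    prompt_tokens_details: Optional[PromptTokensDetails] = None


usage = Usage(
    prompt_tokens=120,
    completion_tokens=30,
    total_tokens=150,
    prompt_tokens_details=PromptTokensDetails(cached_tokens=100),
)
print(asdict(usage))
```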
Files:

* config.yml
* requirements.txt