Mirror of https://github.com/meta-llama/llama-stack.git
Latest commit: vllm - requires max_tokens be set, use config value; set tool_choice to none if no tools provided (sketched below)

| Name | Last commit message | Age |
|---|---|---|
| inline | | |
| registry | | |
| remote | | |
| utils | | |
| __init__.py | | |
| datatypes.py | | |
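
The commit message above describes two request-preparation behaviors in the vLLM inference provider: vLLM requires `max_tokens` to be set, so the provider falls back to a value from its config when the request leaves it unset, and `tool_choice` is forced to `none` when the request carries no tools. A minimal sketch of that logic, using hypothetical `VLLMConfig` and `ChatCompletionRequest` stand-ins rather than the real llama-stack classes:

```python
from dataclasses import dataclass, field
from typing import Any, Optional

# Hypothetical stand-ins for the provider config and request types;
# the actual llama-stack classes may have different names and fields.
@dataclass
class VLLMConfig:
    max_tokens: int = 4096  # fallback used when the request leaves max_tokens unset

@dataclass
class ChatCompletionRequest:
    messages: list
    max_tokens: Optional[int] = None
    tools: list = field(default_factory=list)
    tool_choice: Optional[str] = "auto"

def build_vllm_params(request: ChatCompletionRequest, config: VLLMConfig) -> dict[str, Any]:
    """Translate a chat request into vLLM (OpenAI-compatible) call parameters."""
    params: dict[str, Any] = {
        "messages": request.messages,
        # vLLM requires max_tokens to be set, so fall back to the config value.
        "max_tokens": request.max_tokens or config.max_tokens,
    }
    if request.tools:
        params["tools"] = request.tools
        params["tool_choice"] = request.tool_choice or "auto"
    else:
        # No tools supplied: set tool_choice to "none" explicitly so the
        # server does not attempt tool calling on a tool-less request.
        params["tool_choice"] = "none"
    return params

# Usage example: a request with no max_tokens and no tools picks up the
# config default and gets tool_choice="none".
req = ChatCompletionRequest(messages=[{"role": "user", "content": "hello"}])
print(build_vllm_params(req, VLLMConfig()))
```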