Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-10-24 00:47:00 +00:00)
# What does this PR do?

Remove unused `chat_completion` implementations.

vLLM features ported (sketched below, after the test plan):
- require `max_tokens` to be set, falling back to the config value
- set `tool_choice` to `none` if no tools are provided

## Test Plan

ci
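Below is a minimal sketch of the two ported behaviors, assuming a hypothetical helper that builds the outgoing request parameters; the `build_completion_params` name, the `config["max_tokens"]` field, and the OpenAI-style payload are illustrative assumptions, not the actual llama-stack code.

```python
def build_completion_params(request: dict, config: dict) -> dict:
    """Sketch of the two ported vLLM behaviors (hypothetical helper)."""
    params = {
        "model": request["model"],
        "messages": request["messages"],
        # max_tokens is required: fall back to the configured default
        # when the caller did not set one.
        "max_tokens": request.get("max_tokens") or config["max_tokens"],
    }

    tools = request.get("tools")
    if tools:
        params["tools"] = tools
        params["tool_choice"] = request.get("tool_choice", "auto")
    else:
        # No tools provided: force tool_choice to "none" so the backend
        # never attempts a tool call.
        params["tool_choice"] = "none"

    return params
```

For example, `build_completion_params({"model": "m", "messages": []}, {"max_tokens": 512})` would yield `max_tokens=512` and `tool_choice="none"`.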