Fixes: #1955

Since 0.2.0, vLLM receives an empty list (vs. ``None`` in 0.1.9 and earlier) when no tools are configured, which causes the issue described in #1955. This patch avoids sending the 'tools' param to vLLM altogether instead of sending an empty list. It also adds a small unit test to guard against regressions.

Signed-off-by: Daniel Alvarez <dalvarez@redhat.com>
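The idea behind the change can be sketched as below. This is a minimal illustration only, not the adapter's actual code: the function name `build_vllm_request_params` and its signature are assumptions. The point is that the 'tools' key is attached to the request only when at least one tool is configured, and a small test pins that behavior down.

```python
# Illustrative sketch of the fix, not the actual llama-stack vLLM adapter code.
from typing import Any


def build_vllm_request_params(
    model: str,
    messages: list[dict[str, Any]],
    tools: list[dict[str, Any]] | None = None,
) -> dict[str, Any]:
    """Build an OpenAI-compatible chat request body for vLLM.

    The 'tools' key is included only when at least one tool is configured;
    sending an empty list (the behavior since 0.2.0) triggers the failure
    described in #1955, so the parameter is omitted entirely instead.
    """
    params: dict[str, Any] = {"model": model, "messages": messages}
    if tools:  # skip the key for both None and []
        params["tools"] = tools
    return params


def test_no_tools_param_when_tools_are_empty() -> None:
    """Regression check: an empty tool list must not produce a 'tools' key."""
    params = build_vllm_request_params(
        model="example-model",
        messages=[{"role": "user", "content": "hello"}],
        tools=[],
    )
    assert "tools" not in params
```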
Directory listing:

- inline
- registry
- remote
- tests
- utils
- __init__.py
- datatypes.py