Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-18 13:59:50 +00:00)
# What does this PR do?

Sometimes the stream doesn't have chunks with a `finish_reason` (e.g., a canceled stream), which throws a pydantic validation error because `OpenAIChoice.finish_reason` is typed as `str`.

## Test Plan

Observed no more such errors when benchmarking.
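The failure mode above can be sketched with a minimal pydantic v2 model. `OpenAIChoice` and `finish_reason` come from the PR description; the exact field set and the fix (loosening the annotation to `Optional[str]` with a `None` default) are assumptions for illustration, not the repository's actual code.

```python
from typing import Optional

from pydantic import BaseModel


class OpenAIChoice(BaseModel):
    """Simplified sketch of a streaming choice (hypothetical field set)."""

    index: int = 0
    # Canceled streams can emit final chunks that carry no finish_reason.
    # With a bare `finish_reason: str` annotation, validating such a chunk
    # raises a pydantic ValidationError; Optional with a None default
    # accepts it instead.
    finish_reason: Optional[str] = None


# A chunk from a canceled stream: no finish_reason present.
chunk = OpenAIChoice.model_validate({"index": 0})
print(chunk.finish_reason)  # None

# A normal final chunk still round-trips as before.
done = OpenAIChoice.model_validate({"index": 0, "finish_reason": "stop"})
print(done.finish_reason)  # stop
```

The alternative, defaulting missing values to a sentinel string like `"stop"`, would mask the cancellation; keeping `None` lets callers distinguish a truncated stream from a completed one.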