Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-12-12 12:06:04 +00:00
This patch ensures that if max tokens is not defined, it is set to None. This prevents some providers from failing, as they have no protection against it being set to 0. Issue: #3666
Files:

- recordings/
- __init__.py
- constants.py
- test_eval.py
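The normalization described above can be sketched as a small helper; the function name and signature here are hypothetical, not the patch's actual code:

```python
from typing import Optional


def normalize_max_tokens(max_tokens: Optional[int]) -> Optional[int]:
    """Treat an unset or zero max_tokens as None.

    Some providers reject a max_tokens of 0, so passing None
    (meaning "no explicit limit") avoids the failure described
    in the patch. Hypothetical helper for illustration only.
    """
    # 0 and None both mean "not defined" here; forward None instead.
    if not max_tokens:
        return None
    return max_tokens
```

With this guard in place, a request built with `max_tokens=0` or `max_tokens=None` reaches the provider with no limit set, while an explicit positive value passes through unchanged.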