llama-stack-mirror/tests/integration/eval
Luis Tomas Bolivar 43fb18928b Fix BadRequestError due to invalid max_tokens
This patch ensures that if max_tokens is not defined it is set to None.
This avoids failures in some providers, which have no protection against
it being set to 0.

Issue: #3666
2025-10-03 18:05:02 +02:00
..
__init__.py refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) 2025-03-04 14:53:47 -08:00
constants.py refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) 2025-03-04 14:53:47 -08:00
test_eval.py Fix BadRequestError due to invalid max_tokens 2025-10-03 18:05:02 +02:00
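The fix described in the commit message above can be sketched as follows. This is a hypothetical illustration, not the actual llama-stack code: the function name `build_sampling_params` and the dict layout are assumptions; the point is only the pattern of normalizing an unset or zero `max_tokens` to None rather than forwarding 0 to the provider.

```python
def build_sampling_params(max_tokens=None, temperature=1.0):
    """Build a sampling-params dict, defaulting max_tokens to None.

    Hypothetical sketch: some providers reject max_tokens=0 with a
    BadRequestError, so an unset or zero value is normalized to None.
    """
    params = {"temperature": temperature}
    # Only carry a real token limit; 0 or unset collapses to None.
    params["max_tokens"] = max_tokens if max_tokens else None
    return params


# Unset and zero both normalize to None; explicit limits pass through.
print(build_sampling_params())               # max_tokens -> None
print(build_sampling_params(max_tokens=0))   # max_tokens -> None
print(build_sampling_params(max_tokens=256)) # max_tokens -> 256
```

With this normalization in place, request-building code never sends a provider `max_tokens=0`, which is what triggered the BadRequestError in issue #3666.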