mirror of https://github.com/meta-llama/llama-stack.git
Fix BadRequestError due to invalid max_tokens
This patch ensures that if max_tokens is not defined, it is set to None. This avoids failures in some providers, which have no protection against it being set to 0. Issue: #3666
parent 4dfbe46954
commit 43fb18928b
8 changed files with 3 additions and 7 deletions
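Below is a minimal sketch of the behavior this patch aims for, assuming a pydantic-style SamplingParams model. The field layout and the helper name get_sampling_options are illustrative assumptions, not code taken from this commit:

```python
from typing import Any, Optional

from pydantic import BaseModel


class SamplingParams(BaseModel):
    # Before this patch the schema default was 0; some providers reject
    # max_tokens=0 with a BadRequestError rather than treating it as
    # "no explicit limit". Leaving it unset now maps to None.
    max_tokens: Optional[int] = None


def get_sampling_options(params: SamplingParams) -> dict[str, Any]:
    """Translate sampling params into provider kwargs (hypothetical helper).

    Only forwards max_tokens when the caller actually set a value, so
    providers never receive a spurious 0.
    """
    options: dict[str, Any] = {}
    if params.max_tokens is not None:
        options["max_tokens"] = params.max_tokens
    return options


# Usage: a request that never set max_tokens produces kwargs without it.
print(get_sampling_options(SamplingParams()))                 # {}
print(get_sampling_options(SamplingParams(max_tokens=128)))   # {'max_tokens': 128}
```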
@@ -1927,7 +1927,6 @@ components:
         description: The sampling strategy.
       max_tokens:
         type: integer
-        default: 0
         description: >-
           The maximum number of tokens that can be generated in the completion.
           The token count of your prompt plus max_tokens cannot exceed the model's
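With `default: 0` removed from the spec, generated clients leave `max_tokens` unset unless the caller supplies a value, and the server-side default of None lets each provider apply its own limit instead of receiving a literal 0.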