mirror of
https://github.com/meta-llama/llama-stack.git
synced 2025-12-12 04:00:42 +00:00
fix client-sdks/stainless/openapi.yml API Spec
commit 1928836d97
parent 9605a80250
1 changed file with 0 additions and 1 deletion
@@ -11601,7 +11601,6 @@ components:
       description: The sampling strategy.
       max_tokens:
         type: integer
-        default: 0
         description: >-
           The maximum number of tokens that can be generated in the completion.
           The token count of your prompt plus max_tokens cannot exceed the model's
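After the deletion, the `max_tokens` property no longer advertises a default of `0`, which would have wrongly suggested that completions produce zero tokens unless the caller overrides it. A sketch of the resulting schema fragment, assuming the surrounding OpenAPI structure (property names outside the hunk are not shown in this commit):

```yaml
# Hypothetical post-fix fragment of client-sdks/stainless/openapi.yml;
# only the lines visible in the diff hunk are confirmed by the commit.
components:
  schemas:
    SamplingParams:          # enclosing schema name is an assumption
      type: object
      properties:
        strategy:
          description: The sampling strategy.
        max_tokens:
          type: integer
          # "default: 0" was removed here
          description: >-
            The maximum number of tokens that can be generated in the completion.
            The token count of your prompt plus max_tokens cannot exceed the model's
```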