Mirror of https://github.com/meta-llama/llama-stack.git
OpenAI's Responses and Completions APIs have a max_output_tokens field. It is currently missing from the create request and response objects in the Responses API. This PR adds it.

fixes: #3562

Signed-off-by: Abhishek Bongale <abhishekbongale@outlook.com>
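A minimal sketch of what the field addition could look like, assuming Pydantic request/response models. The class names and surrounding fields below are illustrative assumptions, not the actual llama-stack schema definitions.

```python
# Illustrative sketch only: class names and neighboring fields are assumptions,
# not the real llama-stack Responses API schemas.
from pydantic import BaseModel


class OpenAIResponseCreateRequest(BaseModel):
    """Create request for the OpenAI-compatible Responses API (hypothetical)."""

    model: str
    input: str
    # Optional cap on the number of tokens the model may generate,
    # mirroring OpenAI's max_output_tokens field.
    max_output_tokens: int | None = None


class OpenAIResponseObject(BaseModel):
    """Response object returned by the OpenAI-compatible Responses API (hypothetical)."""

    id: str
    model: str
    # Echoes the limit that was applied when generating this response, if any.
    max_output_tokens: int | None = None
```

With the field in place, clients can pass max_output_tokens when creating a response and read it back from the returned response object.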
| Name |
|---|
| img |
| providers/vector_io |
| deprecated-llama-stack-spec.yaml |
| experimental-llama-stack-spec.yaml |
| llama-stack-spec.yaml |
| remote_or_local.gif |
| safety_system.webp |
| site.webmanifest |
| stainless-llama-stack-spec.yaml |