llama-stack-mirror/docs/static
Luis Tomas Bolivar f7243a1e2c Fix BadRequestError due to invalid max_tokens
This patch ensures that if max_tokens is not defined it is set to None.
This avoids failures in some providers, which have no protection against
it being set to 0 (see the sketch below).

Issue: #3666
2025-10-23 11:22:52 +02:00
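
A minimal sketch of the idea behind this fix, assuming a hypothetical normalize_max_tokens helper (this is not the actual patch): treat an unset or zero max_tokens as None and drop it from the outgoing request, so providers that reject max_tokens=0 never receive the field.

```python
from typing import Optional


def normalize_max_tokens(max_tokens: Optional[int]) -> Optional[int]:
    """Hypothetical helper: map an unset or zero max_tokens to None."""
    if not max_tokens:  # covers both None and 0
        return None
    return max_tokens


# Usage sketch: build provider params and drop keys whose value is None,
# so a request created with max_tokens=0 is sent without the field at all.
params = {"model": "llama-3", "max_tokens": normalize_max_tokens(0)}
params = {k: v for k, v in params.items() if v is not None}
print(params)  # {'model': 'llama-3'}
```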
img docs: update OG image (#3669) 2025-10-03 10:22:54 -07:00
providers/vector_io docs: static content migration (#3535) 2025-09-24 14:08:50 -07:00
deprecated-llama-stack-spec.html Fix BadRequestError due to invalid max_tokens 2025-10-23 11:22:52 +02:00
deprecated-llama-stack-spec.yaml Fix BadRequestError due to invalid max_tokens 2025-10-23 11:22:52 +02:00
experimental-llama-stack-spec.html Fix BadRequestError due to invalid max_tokens 2025-10-23 11:22:52 +02:00
experimental-llama-stack-spec.yaml Fix BadRequestError due to invalid max_tokens 2025-10-23 11:22:52 +02:00
llama-stack-spec.html fix(conversations)!: update Conversations API definitions (was: bump openai from 1.107.0 to 2.5.0) (#3847) 2025-10-22 12:32:48 -07:00
llama-stack-spec.yaml fix(conversations)!: update Conversations API definitions (was: bump openai from 1.107.0 to 2.5.0) (#3847) 2025-10-22 12:32:48 -07:00
remote_or_local.gif docs: static content migration (#3535) 2025-09-24 14:08:50 -07:00
safety_system.webp docs: static content migration (#3535) 2025-09-24 14:08:50 -07:00
site.webmanifest docs: add favicon and mobile styling (#3650) 2025-10-02 10:42:54 +02:00
stainless-llama-stack-spec.html Fix BadRequestError due to invalid max_tokens 2025-10-23 11:22:52 +02:00
stainless-llama-stack-spec.yaml Fix BadRequestError due to invalid max_tokens 2025-10-23 11:22:52 +02:00