llama-stack-mirror/llama_stack
Luis Tomas Bolivar f7243a1e2c Fix BadRequestError due to invalid max_tokens
This patch ensures that if max_tokens is not defined, it is set to None.
This prevents some providers from failing, as they have no protection
against it being set to 0.

Issue: #3666
2025-10-23 11:22:52 +02:00
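A minimal sketch of the normalization this commit describes, under the assumption that request params carry an optional `max_tokens` field; the class and function names here are hypothetical, not llama_stack's actual request-handling code:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class CompletionParams:
    # Hypothetical params struct; None means "let the provider pick a default".
    prompt: str
    max_tokens: Optional[int] = None


def normalize_max_tokens(max_tokens: Optional[int]) -> Optional[int]:
    """Map an unset or zero max_tokens to None, so providers that reject
    max_tokens=0 with a BadRequestError never receive it."""
    if not max_tokens:  # covers both None and 0
        return None
    return max_tokens


params = CompletionParams(prompt="Hello", max_tokens=0)
params.max_tokens = normalize_max_tokens(params.max_tokens)
assert params.max_tokens is None  # provider now sees no explicit limit
```

Sending None instead of 0 lets each provider apply its own default limit rather than interpreting 0 as a literal (and invalid) token budget.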
apis Fix BadRequestError due to invalid max_tokens 2025-10-23 11:22:52 +02:00
cli fix(logging): move module-level initialization to explicit setup calls (#3874) 2025-10-21 11:08:25 -07:00
core chore(misc): update datasets, benchmarks to use alpha, beta prefixes (#3891) 2025-10-22 15:26:35 -07:00
distributions revert: "chore(cleanup)!: remove tool_runtime.rag_tool" (#3877) 2025-10-21 11:22:06 -07:00
models chore: remove dead code (#3729) 2025-10-07 20:26:02 -07:00
providers fix(responses): sync conversation before yielding terminal events in streaming (#3888) 2025-10-22 14:31:12 -07:00
strong_typing chore: refactor (chat)completions endpoints to use shared params struct (#3761) 2025-10-10 15:46:34 -07:00
testing feat(ci): add support for docker:distro in tests (#3832) 2025-10-16 19:33:13 -07:00
ui build: Bump version to 0.3.0 2025-10-21 23:59:09 +00:00
__init__.py chore(rename): move llama_stack.distribution to llama_stack.core (#2975) 2025-07-30 23:30:53 -07:00
env.py refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) 2025-03-04 14:53:47 -08:00
log.py fix(logging): ensure logs go to stderr, loggers obey levels (#3885) 2025-10-22 13:06:54 -07:00
schema_utils.py fix(auth): allow unauthenticated access to health and version endpoints (#3736) 2025-10-10 13:41:43 -07:00