llama-stack-mirror/llama_stack
Luis Tomas Bolivar 43fb18928b Fix BadRequestError due to invalid max_tokens
This patch ensures that if max_tokens is not defined, it is set to None.
This prevents some providers from failing, as they have no protection
against it being set to 0.

Issue: #3666
2025-10-03 18:05:02 +02:00
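The fix described above can be sketched as follows. This is a minimal illustration only, with hypothetical names (`normalize_max_tokens`, the params dict shape) rather than the actual llama_stack code: the idea is to coerce an unset or zero `max_tokens` to `None` before forwarding the request, so providers without zero-handling do not raise a BadRequestError.

```python
def normalize_max_tokens(params: dict) -> dict:
    """Return params with max_tokens coerced to None when unset or 0.

    Hypothetical helper: some providers reject max_tokens=0 outright,
    so a missing or zero value is normalized to None (meaning
    "no explicit limit") before the request is sent.
    """
    if not params.get("max_tokens"):  # covers missing, None, and 0
        params["max_tokens"] = None
    return params


print(normalize_max_tokens({"model": "llama-3", "max_tokens": 0}))
# {'model': 'llama-3', 'max_tokens': None}
```

An explicit positive value passes through unchanged, e.g. `normalize_max_tokens({"max_tokens": 128})` keeps `128`.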
apis Fix BadRequestError due to invalid max_tokens 2025-10-03 18:05:02 +02:00
cli feat: migrate to FIPS-validated cryptographic algorithms (#3423) 2025-09-12 11:18:19 +02:00
core feat(tools)!: substantial clean up of "Tool" related datatypes (#3627) 2025-10-02 15:12:03 -07:00
distributions docs: Fix Dell distro documentation code snippets (#3640) 2025-10-02 11:11:30 +02:00
models feat(tools)!: substantial clean up of "Tool" related datatypes (#3627) 2025-10-02 15:12:03 -07:00
providers feat: implement keyword and hybrid search for Weaviate provider (#3264) 2025-10-03 10:22:30 +02:00
strong_typing feat(files): fix expires_after API shape (#3604) 2025-09-29 21:29:15 -07:00
testing feat: (re-)enable Databricks inference adapter (#3500) 2025-09-23 15:37:23 -04:00
ui fix: Remove deprecated user param in OpenAIResponseObject (#3596) 2025-09-29 13:55:59 -07:00
__init__.py chore(rename): move llama_stack.distribution to llama_stack.core (#2975) 2025-07-30 23:30:53 -07:00
env.py refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) 2025-03-04 14:53:47 -08:00
log.py feat: auto-detect Console width (#3327) 2025-10-03 10:19:31 +02:00
schema_utils.py feat(api): level inference/rerank and remove experimental (#3565) 2025-09-29 12:42:09 -07:00