llama-stack-mirror/scripts/run_openapi_generator.sh
Ashwin Bharambe 7d3db6b22c
feat(openapi): generate stainless config "more" programmatically (#4164)
Generate the Stainless client config directly from code so we can
validate the config before we ever write the YAML.

This change enforces allowed HTTP verbs/paths, detects duplicate routes
across resources, and ensures README example endpoints exist and match
the OpenAPI spec. The generator now fails fast when config entries
drift, which should keep the published config more closely in sync with
the spec. More validation can be added later, but this is a good start.
2025-11-17 12:48:03 -08:00
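
For context, here is a rough Python sketch of the kind of validation the commit message describes. The names (ALLOWED_METHODS, ConfigError, validate_config) and the argument shapes are hypothetical, illustrative only, and not the actual API of scripts.openapi_generator.stainless_config.generate_config.

# Hypothetical sketch; not the real implementation in generate_config.
from collections import Counter

ALLOWED_METHODS = {"get", "post", "put", "patch", "delete"}

class ConfigError(ValueError):
    """Raised when the generated Stainless config drifts from the OpenAPI spec."""

def validate_config(resources, spec_paths, readme_endpoints):
    """resources: {resource_name: [(method, path), ...]} from the config-as-code.
    spec_paths: {path: {methods}} extracted from the OpenAPI spec.
    readme_endpoints: [(method, path), ...] referenced by README examples."""
    seen = Counter()
    for resource, endpoints in resources.items():
        for method, path in endpoints:
            method = method.lower()
            # Enforce allowed HTTP verbs.
            if method not in ALLOWED_METHODS:
                raise ConfigError(f"{resource}: disallowed verb {method!r} for {path}")
            # Every configured route must exist in the spec.
            if method not in spec_paths.get(path, set()):
                raise ConfigError(f"{resource}: {method.upper()} {path} not in the OpenAPI spec")
            seen[(method, path)] += 1
    # Detect duplicate routes claimed by more than one resource.
    duplicates = [route for route, count in seen.items() if count > 1]
    if duplicates:
        raise ConfigError(f"duplicate routes across resources: {duplicates}")
    # README example endpoints must exist in the config (and thus in the spec).
    missing = [(m, p) for m, p in readme_endpoints if (m.lower(), p) not in seen]
    if missing:
        raise ConfigError(f"README examples reference unknown endpoints: {missing}")

Presumably the real generator builds the config from the route definitions in code and runs checks along these lines before emitting the YAML, which is what lets the script below fail fast instead of publishing a stale config.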

#!/usr/bin/env bash
# Copyright (c) Meta Platforms, Inc. and affiliates.
# All rights reserved.
#
# This source code is licensed under the terms described in the LICENSE file in
# the root directory of this source tree.
PYTHONPATH=${PYTHONPATH:-}
THIS_DIR="$(cd "$(dirname "$(readlink -f "${BASH_SOURCE[0]}")")" && pwd)"
set -euo pipefail
stack_dir=$(dirname "$THIS_DIR")

# Generate the OpenAPI specs into docs/static.
PYTHONPATH=$PYTHONPATH:$stack_dir \
python3 -m scripts.openapi_generator "$stack_dir"/docs/static

# Copy the Stainless-flavored spec next to the Stainless client config.
cp "$stack_dir"/docs/static/stainless-llama-stack-spec.yaml "$stack_dir"/client-sdks/stainless/openapi.yml

# Generate and validate the Stainless client config from code.
PYTHONPATH=$PYTHONPATH:$stack_dir \
python3 -m scripts.openapi_generator.stainless_config.generate_config