Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-07-26 22:19:49 +00:00)
The starter distribution now ships with all the packages required to back its persistent stores (the agent store, metadata store, and inference store) with PostgreSQL. A new run YAML file, run-with-postgres-store.yaml, makes this setup easy. The file is included in the container image, so users can simply override the entrypoint to point at the Postgres-specific config; the documentation has been updated with instructions on how to do that.

Closes: #2619

Signed-off-by: Sébastien Han <seb@redhat.com>
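For illustration, a minimal sketch of the entrypoint override described above. It assumes a Docker runtime, an image named llamastack/distribution-starter, a config path of /app/run-with-postgres-store.yaml inside the image, and POSTGRES_* environment variables for connection details; none of these names are confirmed by this change, so check the updated documentation for the exact values:

```bash
# Sketch: run the starter distribution against the bundled Postgres run config.
# docker's --entrypoint flag takes only the executable; the remaining arguments
# are passed as the container command.
docker run -it \
  -p 8321:8321 \
  -e POSTGRES_HOST=my-postgres.example.com \
  -e POSTGRES_PORT=5432 \
  --entrypoint llama \
  llamastack/distribution-starter \
  stack run /app/run-with-postgres-store.yaml
```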
Directory contents:

- ci-tests
- dell
- meta-reference-gpu
- nvidia
- open-benchmark
- starter
- watsonx
- __init__.py
- template.py