llama-stack/llama_stack/templates/llama_api
ehhuang 2603f10f95
feat: support postgresql inference store (#2310)
# What does this PR do?
* Added support for a PostgreSQL inference store
* Added an 'oracle' template that demonstrates how to configure PostgreSQL-backed stores
(except for telemetry, which does not currently support PostgreSQL); see the config sketch after this list
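
For reference, a PostgreSQL-backed inference store is configured in the template's run.yaml roughly along the lines of the sketch below. The key names and values here are assumptions for illustration, not the exact config the template ships; check the generated run.yaml for the authoritative fields.

```yaml
# Illustrative sketch of a PostgreSQL inference store entry in run.yaml.
# Key names and credentials are assumptions; verify against the generated template.
inference_store:
  type: postgres
  host: localhost
  port: 5432
  db: llamastack
  user: llamastack
  password: llamastack
```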


## Test Plan

llama stack build --template oracle --image-type conda --run

LLAMA_STACK_CONFIG=http://localhost:8321 pytest -s -v tests/integration/ \
  --text-model accounts/fireworks/models/llama-v3p3-70b-instruct -k 'inference_store'
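
To exercise the inference_store tests against a real PostgreSQL locally, a service like the compose sketch below can back the store. The image tag, credentials, and port are placeholders and must match whatever the template's run.yaml expects.

```yaml
# Hypothetical docker compose service providing a local PostgreSQL for the
# inference store. Credentials and port are placeholders; align with run.yaml.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: llamastack
      POSTGRES_PASSWORD: llamastack
      POSTGRES_DB: llamastack
    ports:
      - "5432:5432"
```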
2025-05-29 14:33:09 -07:00
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | feat: add api.llama provider, llama-guard-4 model (#2058) | 2025-04-29 10:07:41 -07:00 |
| build.yaml | feat: support postgresql inference store (#2310) | 2025-05-29 14:33:09 -07:00 |
| llama_api.py | fix: remove code interpeter implementation (#2087) | 2025-05-01 14:35:08 -07:00 |
| run.yaml | feat: add list responses API (#2233) | 2025-05-23 13:16:48 -07:00 |