llama-stack-mirror/.github/actions
Derek Higgins 7420c1db11 test: Add VLLM provider support to integration tests
- Add setup-vllm GitHub action to start VLLM container (see the action sketch below)
- Extend integration test matrix to support both ollama and vllm providers
- Make test setup conditional based on provider type (see the workflow sketch below)
- Add provider-specific environment variables and configurations
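
For context, a composite action that brings up a vLLM server for CI could look roughly like the sketch below. The image, model, inputs, and readiness check are assumptions for illustration, not necessarily what this PR's setup-vllm action does (the actual runner hardware and vLLM build may differ):

```yaml
# .github/actions/setup-vllm/action.yml -- illustrative sketch only
name: Setup VLLM
description: Start a vLLM container serving an OpenAI-compatible API for the integration tests
inputs:
  model:
    description: Model for the server to load (default here is illustrative)
    required: false
    default: "meta-llama/Llama-3.2-1B-Instruct"
  port:
    description: Host port for the OpenAI-compatible endpoint
    required: false
    default: "8000"
runs:
  using: composite
  steps:
    - name: Start vLLM container
      shell: bash
      # The image and flags are assumptions; the real action may use a CPU build
      # or different serving arguments.
      run: |
        docker run -d --name vllm \
          -p "${{ inputs.port }}:8000" \
          vllm/vllm-openai:latest \
          --model "${{ inputs.model }}"
    - name: Wait for the OpenAI-compatible endpoint to come up
      shell: bash
      run: |
        for _ in $(seq 1 60); do
          if curl -sf "http://localhost:${{ inputs.port }}/v1/models" > /dev/null; then
            exit 0
          fi
          sleep 5
        done
        echo "vLLM did not become ready in time" >&2
        docker logs vllm >&2 || true
        exit 1
```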
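
Likewise, a minimal sketch of a test matrix covering both providers with conditional setup steps; the workflow name, environment variable names, endpoints, and test script path are illustrative and not the repository's actual workflow:

```yaml
# Illustrative workflow sketch, not the repo's actual integration-tests workflow
name: integration-tests
on: [pull_request]

jobs:
  integration-tests:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        provider: [ollama, vllm]
    steps:
      - uses: actions/checkout@v4

      # Provider-specific setup: only one of these runs per matrix entry
      - name: Setup ollama
        if: matrix.provider == 'ollama'
        uses: ./.github/actions/setup-ollama

      - name: Setup vllm
        if: matrix.provider == 'vllm'
        uses: ./.github/actions/setup-vllm

      - name: Run integration tests
        env:
          # Endpoint variables are illustrative; 11434 is ollama's default port,
          # 8000 is the vLLM OpenAI-compatible server's default port
          OLLAMA_URL: "http://localhost:11434"
          VLLM_URL: "http://localhost:8000/v1"
        run: |
          # Hypothetical test entry point; the real workflow invokes its own test script
          ./scripts/run-integration-tests.sh --provider "${{ matrix.provider }}"
```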

TODO:
  - Investigate the failing tests for the vllm provider (safety, post_training, and tool runtime).
  - Find a proper fix for #2713 (a temporary fix is included in the first commit of this PR).

Closes: #1648

Signed-off-by: Derek Higgins <derekh@redhat.com>
2025-07-24 16:38:03 +01:00
setup-ollama ci: do not pull model (#2776) 2025-07-16 04:58:05 -07:00
setup-runner test: add tests against published client (#2752) 2025-07-15 12:25:31 -07:00
setup-vllm test: Add VLLM provider support to integration tests 2025-07-24 16:38:03 +01:00