fix: Move to older version for docker container failure [fireworks-ai] (#3338)

# What does this PR do?
Noticed the tests at
https://github.com/llamastack/llama-stack-ops/actions/workflows/test-maybe-cut.yaml
are still failing randomly.

This was fixed earlier by pinning `fireworks-ai` at 0.18.0 in
https://github.com/llamastack/llama-stack/pull/3267, but local testing
may have inadvertently picked a lower version via `<=`, which I had assumed
would resolve to the latest allowed version.
I have now tested with `==` to find the version where it broke, and am
pinning (`<=`) to the last version where it was passing.
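As a minimal sketch of why the `<=` pin behaves this way (version list below is illustrative, not the real fireworks-ai release history): a resolver picks the newest release that satisfies the specifier, so `<=0.17.16` selects 0.17.16 even when 0.18.0 exists.

```python
def parse(version: str) -> tuple[int, ...]:
    # Split "0.17.16" into (0, 17, 16) for numeric comparison.
    return tuple(int(part) for part in version.split("."))

# Hypothetical set of published versions.
available = ["0.17.15", "0.17.16", "0.18.0"]
pin = "0.17.16"

# A "<=" specifier allows everything at or below the pin, and the
# resolver then takes the newest of those.
allowed = [v for v in available if parse(v) <= parse(pin)]
latest = max(allowed, key=parse)
print(latest)  # 0.17.16
```

Testing with `==` instead removes this ambiguity, since exactly one version can satisfy the specifier.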


## Test Plan
Tested locally with the following commands to start a container:

1. Build the container: `llama stack build --distro starter --image-type container`
2. Start the container: `docker run -d -p 8321:8321 --name llama-stack-test distribution-starter:0.2.20`
3. Check health: `http://localhost:8321/v1/health`

The steps above fail without the fix.

Tested with `==` to ensure the same version is picked in local testing
instead of anything lower.
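The health-check step can be sketched as a small polling helper. The stub server below is an assumption standing in for the real container (it assumes `/v1/health` answers HTTP 200 with a JSON body when healthy); only the port and the endpoint path come from the test plan above.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class _StubHealth(BaseHTTPRequestHandler):
    """Stand-in for the llama-stack container's health endpoint."""

    def do_GET(self):
        if self.path == "/v1/health":
            body = json.dumps({"status": "OK"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

def check_health(base_url: str) -> bool:
    # Returns True when GET {base_url}/v1/health answers with HTTP 200.
    with urllib.request.urlopen(f"{base_url}/v1/health", timeout=5) as resp:
        return resp.status == 200

# Run the stub on an ephemeral port and probe it once.
server = HTTPServer(("127.0.0.1", 0), _StubHealth)
threading.Thread(target=server.serve_forever, daemon=True).start()
ok = check_health(f"http://127.0.0.1:{server.server_address[1]}")
server.shutdown()
print(ok)  # True when the endpoint responds with 200
```

Against the real container the same helper would be pointed at `http://localhost:8321` after the `docker run` step.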

Following along here for the fix from `fireworks-ai`:
1410674695

https://github.com/llamastack/llama-stack/issues/3273
This commit is contained in:
slekkala1 2025-09-04 11:47:46 -07:00, committed by GitHub
parent bcc7f2c7d0
commit 561d2fc6b8

```diff
@@ -116,7 +116,7 @@ def available_providers() -> list[ProviderSpec]:
         adapter=AdapterSpec(
             adapter_type="fireworks",
             pip_packages=[
-                "fireworks-ai<=0.18.0",
+                "fireworks-ai<=0.17.16",
             ],
             module="llama_stack.providers.remote.inference.fireworks",
             config_class="llama_stack.providers.remote.inference.fireworks.FireworksImplConfig",
```