llama-stack-mirror/tests/integration/providers/test_providers.py
Charlie Doern 436f8ade9e feat: implement provider updating
add `v1/providers/` which uses PUT to allow users to change their provider configuration

this is a follow up to #1429 and related to #1359

a user can call something like:

`llama_stack_client.providers.update(api="inference", provider_id="ollama", provider_type="remote::ollama", config={'url': 'http://localhost:12345'})`

or

`llama-stack-client providers update inference ollama remote::ollama "{'url': 'http://localhost:12345'}"`

this API works by adding a `RequestMiddleware` to the server that inspects incoming requests; when a user issues PUT /v1/providers, the routes are re-registered with the re-initialized provider configurations and methods
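the middleware idea can be sketched as follows (a minimal, hypothetical sketch — the class and method names are assumed for illustration and are not the actual llama-stack implementation): a dispatch hook matches PUT requests against the `/v1/providers` prefix and fires a re-registration callback before the request is handled.

```python
# Hypothetical sketch of the middleware pattern described above.
# Names (ProviderUpdateMiddleware, dispatch, on_provider_update) are assumed.
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class ProviderUpdateMiddleware:
    # callback invoked with the new provider config so the server can
    # re-initialize the provider and re-register its routes
    on_provider_update: Callable[[dict], None]
    handled: list = field(default_factory=list)

    def dispatch(self, method: str, path: str, body: dict) -> str:
        if method == "PUT" and path.startswith("/v1/providers"):
            # re-initialize the provider with the new configuration
            self.on_provider_update(body)
            self.handled.append(path)
            return "updated"
        # all other requests pass through untouched
        return "passthrough"
```

in the real server this hook would wrap the ASGI app; the sketch only shows the routing decision and the callback.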

for the client, `self.impls` is updated to hold the proper methods+configurations
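the client-side swap amounts to replacing the cached per-API implementation, roughly like this (attribute and method names assumed, not the actual library-client code):

```python
# Hypothetical sketch: after a provider update, the library client
# rebuilds its in-process implementation table so subsequent calls
# use the new provider configuration.
class LibraryClientSketch:
    def __init__(self):
        self.impls = {}  # api name -> provider implementation object

    def apply_provider_update(self, api: str, new_impl) -> None:
        # drop the stale implementation and cache the re-initialized one
        self.impls[api] = new_impl
```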

this depends on a client PR; CI will fail until that lands, but the tests succeeded locally

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-07-01 10:04:10 -04:00


# Copyright (c) Meta Platforms, Inc. and affiliates.
# All rights reserved.
#
# This source code is licensed under the terms described in the LICENSE file in
# the root directory of this source tree.
from llama_stack import LlamaStackAsLibraryClient
from llama_stack_client import LlamaStackClient


class TestProviders:
    def test_providers(self, llama_stack_client: LlamaStackAsLibraryClient | LlamaStackClient):
        provider_list = llama_stack_client.providers.list()
        assert provider_list is not None
        assert len(provider_list) > 0

        for provider in provider_list:
            retrieved_provider = llama_stack_client.providers.retrieve(provider.provider_id)
            assert retrieved_provider is not None

    def test_providers_update(self, llama_stack_client: LlamaStackAsLibraryClient | LlamaStackClient):
        new_cfg = {"url": "http://localhost:12345"}
        _ = llama_stack_client.providers.retrieve("ollama")

        llama_stack_client.providers.update(
            api="inference",
            provider_id="ollama",
            provider_type="remote::ollama",
            config=new_cfg,
        )

        new_provider = llama_stack_client.providers.retrieve("ollama")
        assert new_provider.config == new_cfg