llama-stack-mirror/llama_stack/cli/stack
Charlie Doern dcc6b1eee9 refactor: install external provider via module
Using `module` in the provider class, together with the fact that the `build` and `run` configs both now use the same `class Provider`, enables us to point to an external provider via a `module`.

For example, say this is in your build config:

```yaml
- provider_id: ramalama
  provider_type: remote::ramalama
  module: ramalama_stack
```

During build (in the various build scripts), in addition to installing any pip dependencies, we also install this module and use its `get_provider_spec` method to retrieve the `ProviderSpec` that is currently specified via `providers.d`.
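As a rough illustration of that discovery step, here is a minimal sketch (the actual build scripts differ; `external_provider_spec` is a hypothetical helper, and the `llama_stack.providers.datatypes` import path is an assumption):

```python
import importlib

from llama_stack.providers.datatypes import ProviderSpec  # assumed import path


def external_provider_spec(module_name: str) -> ProviderSpec:
    """Resolve a ProviderSpec from an installed external provider module.

    The module (e.g. `ramalama_stack`) is expected to have been pip-installed
    during the build step and to expose a module-level `get_provider_spec()`.
    """
    module = importlib.import_module(module_name)
    if not hasattr(module, "get_provider_spec"):
        raise ValueError(f"module {module_name} does not expose get_provider_spec()")
    return module.get_provider_spec()


# e.g. for the build config entry above:
# spec = external_provider_spec("ramalama_stack")
```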

Most (if not all) external providers today have a `get_provider_spec` method that sits unused. Using this method instead of the `providers.d` route makes installing external providers much easier and limits the amount of extra configuration a regular user has to do to get their stack off the ground.
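For reference, the provider side of the contract is just a module-level function. A minimal sketch of what such a function might look like (illustrative only; the field values, the `config_class` path, and the exact `AdapterSpec`/`remote_provider_spec` signatures are assumptions that depend on the llama_stack version):

```python
# Hypothetical contents of ramalama_stack/__init__.py (illustrative only).
from llama_stack.providers.datatypes import (  # assumed import path
    AdapterSpec,
    Api,
    remote_provider_spec,
)


def get_provider_spec():
    # Returns the same information that previously lived in the providers.d
    # YAML spec; exact fields depend on the llama_stack version.
    return remote_provider_spec(
        api=Api.inference,
        adapter=AdapterSpec(
            adapter_type="ramalama",
            pip_packages=["ramalama-stack"],
            config_class="ramalama_stack.config.RamalamaImplConfig",  # placeholder path
            module="ramalama_stack",
        ),
    )
```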

So far, giving users instructions for installing external providers has been difficult: they need to install the module as a prerequisite, create the `providers.d` directory, copy in the provider spec, and also copy in the necessary build/run YAML files.

Using the module is a more seamless discovery method.

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-07-24 18:54:00 -04:00
| File | Last commit | Date |
|------|-------------|------|
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| _build.py | refactor: install external provider via module | 2025-07-24 18:54:00 -04:00 |
| build.py | feat: --image-type argument overrides value in --config build.yaml (#2179) | 2025-05-16 14:45:41 -07:00 |
| list_apis.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| list_providers.py | chore: more mypy fixes (#2029) | 2025-05-06 09:52:31 -07:00 |
| list_stacks.py | feat: add llama stack rm command (#2127) | 2025-05-21 10:25:51 +02:00 |
| remove.py | chore: make cprint write to stderr (#2250) | 2025-05-24 23:39:57 -07:00 |
| run.py | chore: merge --config and --template in server.py (#2716) | 2025-07-21 13:19:27 -07:00 |
| stack.py | feat: add llama stack rm command (#2127) | 2025-05-21 10:25:51 +02:00 |
| utils.py | fix: Use CONDA_DEFAULT_ENV presence as a flag to use conda mode (#1555) | 2025-03-27 17:13:22 -04:00 |