From 8a6c0fb93042d995d0ad5b091c52885b1f95fc4b Mon Sep 17 00:00:00 2001
From: Kelly Brown <86735520+kelbrown20@users.noreply.github.com>
Date: Thu, 31 Jul 2025 12:21:13 -0400
Subject: [PATCH] docs: Reformat external provider documentation (#2982)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

**Description**

This PR adjusts the external providers documentation to align with the new
providers format, splitting it into two pages: a list of the existing external
providers and a guide to creating your own.

Open to feedback on the titles and structure.
---
 .../external-providers-guide.md}              | 96 ++++++++----------
 .../external/external-providers-list.md       | 10 ++
 docs/source/providers/external/index.md       | 13 +++
 docs/source/providers/index.md                |  2 +-
 4 files changed, 65 insertions(+), 56 deletions(-)
 rename docs/source/providers/{external.md => external/external-providers-guide.md} (85%)
 create mode 100644 docs/source/providers/external/external-providers-list.md
 create mode 100644 docs/source/providers/external/index.md

diff --git a/docs/source/providers/external.md b/docs/source/providers/external/external-providers-guide.md
similarity index 85%
rename from docs/source/providers/external.md
rename to docs/source/providers/external/external-providers-guide.md
index f906890f1..2479d406f 100644
--- a/docs/source/providers/external.md
+++ b/docs/source/providers/external/external-providers-guide.md
@@ -1,9 +1,4 @@
-# External Providers Guide
-
-Llama Stack supports external providers that live outside of the main codebase. This allows you to:
-- Create and maintain your own providers independently
-- Share providers with others without contributing to the main codebase
-- Keep provider-specific code separate from the core Llama Stack code
+# Creating External Providers
 
 ## Configuration
 
@@ -55,17 +50,6 @@ Llama Stack supports two types of external providers:
 1. **Remote Providers**: Providers that communicate with external services (e.g., cloud APIs)
 2. **Inline Providers**: Providers that run locally within the Llama Stack process
 
-## Known External Providers
-
-Here's a list of known external providers that you can use with Llama Stack:
-
-| Name | Description | API | Type | Repository |
-|------|-------------|-----|------|------------|
-| KubeFlow Training | Train models with KubeFlow | Post Training | Remote | [llama-stack-provider-kft](https://github.com/opendatahub-io/llama-stack-provider-kft) |
-| KubeFlow Pipelines | Train models with KubeFlow Pipelines | Post Training | Inline **and** Remote | [llama-stack-provider-kfp-trainer](https://github.com/opendatahub-io/llama-stack-provider-kfp-trainer) |
-| RamaLama | Inference models with RamaLama | Inference | Remote | [ramalama-stack](https://github.com/containers/ramalama-stack) |
-| TrustyAI LM-Eval | Evaluate models with TrustyAI LM-Eval | Eval | Remote | [llama-stack-provider-lmeval](https://github.com/trustyai-explainability/llama-stack-provider-lmeval) |
-
 ### Remote Provider Specification
 
 Remote providers are used when you need to communicate with external services. Here's an example for a custom Ollama provider:
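(The example spec itself is unchanged by this patch, so the hunks elide it. For reference, a remote provider spec of this shape looks roughly like the sketch below; the `llama_stack_ollama_provider` package, its `OllamaImplConfig` class, and the pip packages are illustrative placeholders taken from the guide's custom Ollama example, not from this diff.)

```yaml
# Sketch of a remote provider spec, e.g. saved as
# ~/.llama/providers.d/remote/inference/custom_ollama.yaml
# (package, class, and dependency names are illustrative)
adapter:
  adapter_type: custom_ollama          # unique identifier for the adapter
  pip_packages: ["ollama", "aiohttp"]  # dependencies installed at build time
  config_class: llama_stack_ollama_provider.config.OllamaImplConfig
  module: llama_stack_ollama_provider  # Python module containing the adapter code
api_dependencies: []                   # other Llama Stack APIs this provider needs
optional_api_dependencies: []
```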
@@ -119,9 +103,9 @@ container_image: custom-vector-store:latest # optional
 - `provider_data_validator`: Optional validator for provider data
 - `container_image`: Optional container image to use instead of pip packages
 
-## Required Implementation
+## Required Fields
 
-## All Providers
+### All Providers
 
 All providers must contain a `get_provider_spec` function in their `provider` module. This is a standardized structure that Llama Stack expects and is necessary for getting things such as the config class. The `get_provider_spec` method returns a structure identical to the `adapter`. An example function may look like:
 
@@ -146,7 +130,7 @@ def get_provider_spec() -> ProviderSpec:
     )
 ```
 
-### Remote Providers
+#### Remote Providers
 
 Remote providers must expose a `get_adapter_impl()` function in their module that takes two arguments:
 1. `config`: An instance of the provider's config class
@@ -162,7 +146,7 @@ async def get_adapter_impl(
     return OllamaInferenceAdapter(config)
 ```
 
-### Inline Providers
+#### Inline Providers
 
 Inline providers must expose a `get_provider_impl()` function in their module that takes two arguments:
 1. `config`: An instance of the provider's config class
@@ -189,7 +173,40 @@ Version: 0.1.0
 Location: /path/to/venv/lib/python3.10/site-packages
 ```
 
-## Example using `external_providers_dir`: Custom Ollama Provider
+## Best Practices
+
+1. **Package Naming**: Use the prefix `llama-stack-provider-` for your provider packages to make them easily identifiable.
+
+2. **Version Management**: Keep your provider package versioned and compatible with the Llama Stack version you're using.
+
+3. **Dependencies**: Only include the minimum required dependencies in your provider package.
+
+4. **Documentation**: Include clear documentation in your provider package about:
+   - Installation requirements
+   - Configuration options
+   - Usage examples
+   - Any limitations or known issues
+
+5. **Testing**: Include tests in your provider package to ensure it works correctly with Llama Stack.
+You can refer to the [integration tests
+guide](https://github.com/meta-llama/llama-stack/blob/main/tests/integration/README.md) for more
+information. Run the tests for the provider type you are developing.
+
+## Troubleshooting
+
+If your external provider isn't being loaded:
+
+1. Check that `module` points to a published pip package with a top-level `provider` module including `get_provider_spec`.
+2. Check that the `external_providers_dir` path is correct and accessible.
+3. Verify that the YAML files are properly formatted.
+4. Ensure all required Python packages are installed.
+5. Check the Llama Stack server logs for any error messages; turn on debug logging to get more
+   information using `LLAMA_STACK_LOGGING=all=debug`.
+6. Verify that the provider package is installed in your Python environment if using `external_providers_dir`.
+
+## Examples
+
+### Example using `external_providers_dir`: Custom Ollama Provider
 
 Here's a complete example of creating and using a custom Ollama provider:
 
@@ -241,7 +258,7 @@ external_providers_dir: ~/.llama/providers.d/
 
 The provider will now be available in Llama Stack with the type `remote::custom_ollama`.
 
-## Example using `module`: ramalama-stack
+### Example using `module`: ramalama-stack
 
 [ramalama-stack](https://github.com/containers/ramalama-stack) is a recognized external provider that supports installation via module.
 
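(The build configuration for this example is unchanged and elided from the hunk that follows. For context, a minimal `module`-based build file looks roughly like the sketch below; the version pin and image name are illustrative, borrowed from the ramalama-stack README rather than from this diff.)

```yaml
# Sketch of a build.yaml that installs the provider from PyPI via `module`
# (version pin and image name are illustrative)
distribution_spec:
  providers:
    inference:
    - provider_type: remote::ramalama
      module: ramalama_stack==0.3.0a0  # pip-installable package and version
image_type: venv
image_name: ramalama-stack-test
```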
@@ -266,35 +283,4 @@ additional_pip_packages:
 
 No other steps are required other than `llama stack build` and `llama stack run`. The build process will use `module` to install all of the provider dependencies, retrieve the spec, etc.
 
-The provider will now be available in Llama Stack with the type `remote::ramalama`.
-
-## Best Practices
-
-1. **Package Naming**: Use the prefix `llama-stack-provider-` for your provider packages to make them easily identifiable.
-
-2. **Version Management**: Keep your provider package versioned and compatible with the Llama Stack version you're using.
-
-3. **Dependencies**: Only include the minimum required dependencies in your provider package.
-
-4. **Documentation**: Include clear documentation in your provider package about:
-   - Installation requirements
-   - Configuration options
-   - Usage examples
-   - Any limitations or known issues
-
-5. **Testing**: Include tests in your provider package to ensure it works correctly with Llama Stack.
-You can refer to the [integration tests
-guide](https://github.com/meta-llama/llama-stack/blob/main/tests/integration/README.md) for more
-information. Execute the test for the Provider type you are developing.
-
-## Troubleshooting
-
-If your external provider isn't being loaded:
-
-1. Check that `module` points to a published pip package with a top level `provider` module including `get_provider_spec`.
-1. Check that the `external_providers_dir` path is correct and accessible.
-2. Verify that the YAML files are properly formatted.
-3. Ensure all required Python packages are installed.
-4. Check the Llama Stack server logs for any error messages - turn on debug logging to get more
-   information using `LLAMA_STACK_LOGGING=all=debug`.
-5. Verify that the provider package is installed in your Python environment if using `external_providers_dir`.
+The provider will now be available in Llama Stack with the type `remote::ramalama`.
\ No newline at end of file
diff --git a/docs/source/providers/external/external-providers-list.md b/docs/source/providers/external/external-providers-list.md
new file mode 100644
index 000000000..49f49076b
--- /dev/null
+++ b/docs/source/providers/external/external-providers-list.md
@@ -0,0 +1,10 @@
+# Known External Providers
+
+Here's a list of known external providers that you can use with Llama Stack:
+
+| Name | Description | API | Type | Repository |
+|------|-------------|-----|------|------------|
+| KubeFlow Training | Train models with KubeFlow | Post Training | Remote | [llama-stack-provider-kft](https://github.com/opendatahub-io/llama-stack-provider-kft) |
+| KubeFlow Pipelines | Train models with KubeFlow Pipelines | Post Training | Inline **and** Remote | [llama-stack-provider-kfp-trainer](https://github.com/opendatahub-io/llama-stack-provider-kfp-trainer) |
+| RamaLama | Inference models with RamaLama | Inference | Remote | [ramalama-stack](https://github.com/containers/ramalama-stack) |
+| TrustyAI LM-Eval | Evaluate models with TrustyAI LM-Eval | Eval | Remote | [llama-stack-provider-lmeval](https://github.com/trustyai-explainability/llama-stack-provider-lmeval) |
\ No newline at end of file
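(Once installed, a provider from this list is enabled like any built-in provider. A minimal `run.yaml` fragment might look roughly like the sketch below; the provider id and the empty config are illustrative:)

```yaml
# Sketch: enabling the RamaLama inference provider from the list above
providers:
  inference:
  - provider_id: ramalama        # arbitrary id you choose for this instance
    provider_type: remote::ramalama
    config: {}                   # provider-specific settings go here
```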
diff --git a/docs/source/providers/external/index.md b/docs/source/providers/external/index.md
new file mode 100644
index 000000000..989a7f5b8
--- /dev/null
+++ b/docs/source/providers/external/index.md
@@ -0,0 +1,13 @@
+# External Providers
+
+Llama Stack supports external providers that live outside of the main codebase. This allows you to:
+- Create and maintain your own providers independently
+- Share providers with others without contributing to the main codebase
+- Keep provider-specific code separate from the core Llama Stack code
+
+```{toctree}
+:maxdepth: 1
+
+external-providers-list
+external-providers-guide
+```
\ No newline at end of file
diff --git a/docs/source/providers/index.md b/docs/source/providers/index.md
index 97971c232..3f66ecd0c 100644
--- a/docs/source/providers/index.md
+++ b/docs/source/providers/index.md
@@ -15,7 +15,7 @@ Importantly, Llama Stack always strives to provide at least one fully inline pro
 ```{toctree}
 :maxdepth: 1
 
-external
+external/index
 openai
 inference/index
 agents/index