Organize references

Ashwin Bharambe 2024-11-22 16:46:45 -08:00
parent 6fbf526d5c
commit 6229562760
5 changed files with 17 additions and 13 deletions

View file

@@ -1,8 +1,7 @@
-# Developer Guide: Adding a New API Provider
+# Adding a New API Provider
 This guide contains references to walk you through adding a new API provider.
-### Adding a new API provider
 1. First, decide which API your provider falls into (e.g. Inference, Safety, Agents, Memory).
 2. Decide whether your provider is a remote provider or an inline implementation. A remote provider makes a remote request to a service; an inline provider executes its implementation locally. Check out the examples and follow the structure to add your own API provider (a minimal sketch of the two styles follows this list). Please find the following code pointers:
@@ -12,7 +11,7 @@ This guide contains references to walk you through adding a new API provider.
 3. [Build a Llama Stack distribution](https://llama-stack.readthedocs.io/en/latest/distribution_dev/building_distro.html) with your API provider.
 4. Test your code!
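
To make the remote-vs-inline distinction in step 2 concrete, here is a minimal sketch. All names in it (`InferenceProvider`, the `completion` method, the `/completion` endpoint) are illustrative assumptions, not llama_stack's actual provider interface: an inline provider executes in-process, while a remote provider forwards each call to an external service.

```python
# Illustrative sketch only -- class and method names are assumptions,
# not llama_stack's real provider interface.
from abc import ABC, abstractmethod

import httpx  # assumed HTTP client for the remote case


class InferenceProvider(ABC):
    """Hypothetical interface for an Inference API provider."""

    @abstractmethod
    async def completion(self, prompt: str) -> str: ...


class InlineInferenceProvider(InferenceProvider):
    """Inline: the implementation runs locally, in the same process."""

    def __init__(self, model) -> None:
        self.model = model  # e.g. a locally loaded model object

    async def completion(self, prompt: str) -> str:
        return self.model.generate(prompt)


class RemoteInferenceProvider(InferenceProvider):
    """Remote: every call is forwarded to an external service."""

    def __init__(self, base_url: str) -> None:
        self.base_url = base_url

    async def completion(self, prompt: str) -> str:
        async with httpx.AsyncClient() as client:
            resp = await client.post(f"{self.base_url}/completion", json={"prompt": prompt})
            resp.raise_for_status()
            return resp.json()["completion"]
```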
-### Testing your newly added API providers
+## Testing your newly added API providers
 1. Start with an _integration test_ for your provider. That means we will instantiate the real provider, pass it real configuration, and if it is a remote service, we will actually hit the remote service. We **strongly** discourage mocking for these tests at the provider level. Llama Stack is first and foremost about integration, so we need to make sure stuff works end-to-end. See [llama_stack/providers/tests/inference/test_inference.py](../llama_stack/providers/tests/inference/test_inference.py) for an example; a test sketch in the same spirit follows this list.
@@ -22,5 +21,6 @@ This guide contains references to walk you through adding a new API provider.
 You can find more complex client scripts in the [llama-stack-apps](https://github.com/meta-llama/llama-stack-apps/tree/main) repo. Note down which scripts work and which do not with your distribution.
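
In the spirit of the integration-test guidance above (real provider, real configuration, no mocks), a test might look like the sketch below. The provider and config classes are hypothetical placeholders, not the actual contents of test_inference.py, and the async test assumes the pytest-asyncio plugin is installed.

```python
# Hypothetical integration test: instantiates a real provider from real
# configuration and, for a remote provider, actually hits the remote service.
import os

import pytest


@pytest.fixture
def inference_provider():
    # Placeholder names -- substitute your provider's real classes here.
    from my_provider import MyInferenceProvider, MyProviderConfig  # hypothetical

    config = MyProviderConfig(url=os.environ["MY_PROVIDER_URL"])  # no mocking
    return MyInferenceProvider(config)


@pytest.mark.asyncio  # requires the pytest-asyncio plugin
async def test_completion_end_to_end(inference_provider):
    result = await inference_provider.completion(prompt="Hello")
    assert isinstance(result, str) and len(result) > 0
```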
-### Submit your PR
+## Submit your PR
 After you have fully tested your newly added API provider, submit a PR with the attached test plan. You must have a Test Plan in the summary section of your PR.

View file

@@ -72,7 +72,7 @@ Llama Stack already has a number of "adapters" available for some popular Inference
 - Look at [Quick Start](getting_started/index) section to get started with Llama Stack.
 - Learn more about [Llama Stack Concepts](concepts/index) to understand how different components fit together.
-- Check out [Zero to Hero](zero_to_hero_guide) guide to learn in details about how to build your first agent.
+- Check out the [Zero to Hero](https://github.com/meta-llama/llama-stack/tree/main/docs/zero_to_hero_guide) guide to learn in detail how to build your first agent.
 - See how you can use [Llama Stack Distributions](distributions/index) to get started with popular inference and other service providers.
 We also provide a number of client-side SDKs to make it easier to connect to a Llama Stack server in your preferred language.
@@ -94,4 +94,5 @@ getting_started/index
 concepts/index
 distributions/index
 contributing/index
+references/index
 ```

View file

@@ -1,8 +1,11 @@
 # References
+- [Llama CLI](llama_cli_reference/index) for building and running your Llama Stack server
+- [Llama Stack Client CLI](llama_stack_client_cli_reference/index) for interacting with your Llama Stack server
 ```{toctree}
 :maxdepth: 2
-```
-# llama_cli_reference/index
-# llama_cli_reference/download_models
-# llama_stack_client_cli_reference/index
+:hidden:
+
+llama_cli_reference/index
+llama_stack_client_cli_reference/index
+llama_cli_reference/download_models
+```
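
For context on the reorganized index above: a `:hidden:` toctree still registers the pages in Sphinx's navigation sidebar while suppressing the auto-rendered list of links in the page body, so the two bullet links added at the top give readers annotated entry points instead.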

View file

@@ -1,4 +1,4 @@
-# llama CLI Reference
+# llama (server-side) CLI Reference
 The `llama` CLI tool helps you set up and use the Llama Stack. It should be available on your path after installing the `llama-stack` package.
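
A quick way to verify that claim after installation is a small Python check (a sketch; it assumes only that the installed entry point is named `llama` and that the package is installed with `pip install llama-stack`):

```python
# Minimal sanity check that the `llama` entry point is installed and on PATH.
import shutil
import subprocess

if shutil.which("llama") is None:
    raise SystemExit("llama CLI not found; try `pip install llama-stack` first")

subprocess.run(["llama", "--help"], check=True)  # prints the available subcommands
```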

View file

@@ -1,6 +1,6 @@
-# llama-stack-client CLI Reference
+# llama (client-side) CLI Reference
-You may use the `llama-stack-client` to query information about the distribution.
+The `llama-stack-client` CLI allows you to query information about the distribution.
 ## Basic Commands