Update new_api_provider.md

This guide contains references to walk you through adding a new API provider.
### Adding a new API provider
1. First, decide which API your provider falls into (e.g. Inference, Safety, Agents, Memory).
2. Decide whether your provider is a remote provider or an inline implementation. A remote provider makes a request to a remote service; an inline provider executes its implementation locally, in-process. Check out the examples and follow their structure to add your own API provider (see also the illustrative sketch after this list). Relevant code pointers:
   - [Inference Remote Adapter](../llama_stack/providers/adapters/inference/)
   - [Inference Inline Provider](../llama_stack/providers/impls/)
3. [Build a Llama Stack distribution](./building_distro.md) with your API provider.
4. Test your code!
### Testing your newly added API providers
1. Start with an _integration test_ for your provider. That means we will instantiate the real provider, pass it real configuration, and if it is a remote service, we will actually hit the remote service. We **strongly** discourage mocking for these tests at the provider level. Llama Stack is first and foremost about integration, so we need to make sure things work end-to-end. See [llama_stack/providers/tests/inference/test_inference.py](../llama_stack/providers/tests/inference/test_inference.py) for an example, and the sketch after this list.
2. In addition, if you want to unit test functionality within your provider, feel free to do so. You can find some tests in `tests/`, but they aren't well supported so far.
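
As a concrete illustration of the integration-test style described above, here is a hedged sketch using pytest with pytest-asyncio. The provider class, module name, environment variables, and endpoint are assumptions carried over from the sketch in the previous section, not the real test suite's fixtures.

```python
import os

import pytest

# Hypothetical provider from the sketch above, not a real Llama Stack module.
from my_provider import RemoteInferenceAdapter


@pytest.fixture
def provider() -> RemoteInferenceAdapter:
    # Instantiate the real provider with real configuration -- no mocks.
    return RemoteInferenceAdapter(
        base_url=os.environ.get("MY_PROVIDER_URL", "http://localhost:8000"),
        api_key=os.environ.get("MY_PROVIDER_API_KEY", ""),
    )


@pytest.mark.asyncio
async def test_chat_completion_end_to_end(provider: RemoteInferenceAdapter):
    # This genuinely hits the remote service, exercising the provider end-to-end.
    response = await provider.chat_completion(
        messages=[{"role": "user", "content": "Hello, who are you?"}]
    )
    assert response  # a real response came back from the service
```

Unit tests for internal helpers (step 2 above) can live alongside this, but the integration test is the one that proves your provider works against the real backend.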