# API Providers
A Provider is what makes the API real -- it supplies the actual implementation backing the API.
For example, the Inference API could be backed by open source libraries such as `[ torch | vLLM | TensorRT ]`.
A provider can also be just a pointer to a remote REST service -- for example, cloud providers or dedicated inference providers could serve these APIs.
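To make the local-vs-remote distinction concrete, here is a minimal, hypothetical sketch. It is not the actual Llama Stack interface; the class names, the `generate` method on the engine, and the `/completion` endpoint are all assumptions for illustration. The point is simply that one API definition can sit in front of interchangeable providers, some in-process and some remote.

```python
# Illustrative sketch only -- not the real Llama Stack interfaces.
# One API surface, two interchangeable "providers" behind it.

from abc import ABC, abstractmethod

import requests  # used only by the hypothetical remote provider


class InferenceAPI(ABC):
    """The API surface every inference provider must implement."""

    @abstractmethod
    def completion(self, prompt: str) -> str: ...


class LocalProvider(InferenceAPI):
    """Backed by an in-process engine (e.g. torch / vLLM / TensorRT)."""

    def __init__(self, engine):
        # `engine` is any object exposing a generate(prompt) method (assumed).
        self.engine = engine

    def completion(self, prompt: str) -> str:
        return self.engine.generate(prompt)


class RemoteProvider(InferenceAPI):
    """A thin pointer to a remote REST service serving the same API."""

    def __init__(self, base_url: str):
        self.base_url = base_url

    def completion(self, prompt: str) -> str:
        # Hypothetical endpoint and payload shape, for illustration only.
        resp = requests.post(f"{self.base_url}/completion", json={"prompt": prompt})
        resp.raise_for_status()
        return resp.json()["completion"]
```

Code that calls `InferenceAPI` does not need to know which provider is configured; swapping a local engine for a hosted service is a configuration change, not a code change.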
```{toctree}
:maxdepth: 1

new_api_provider
memory_api
```