docs: add RamaLama to list of known external providers (#2004)
The RamaLama project now has an external provider offering for Llama Stack: https://github.com/containers/llama-stack-provider-ramalama

See also: https://github.com/meta-llama/llama-stack/pull/1676

Signed-off-by: Nathan Weinberg <nweinber@redhat.com>
parent 825ce39879
commit d6e88e0bc6

1 changed file with 1 addition and 0 deletions
@@ -53,6 +53,7 @@ Here's a list of known external providers that you can use with Llama Stack:

| Type | Name | Description | Repository |
|------|------|-------------|------------|
| Remote | KubeFlow Training | Train models with KubeFlow | [llama-stack-provider-kft](https://github.com/opendatahub-io/llama-stack-provider-kft) |
| Remote | RamaLama | Inference models with RamaLama | [llama-stack-provider-ramalama](https://github.com/containers/llama-stack-provider-ramalama) |
### Remote Provider Specification
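For context on how an entry in this table ends up being used: once a distribution is built and running with an external inference provider such as RamaLama configured, it is queried through the regular Llama Stack client API. The snippet below is a minimal sketch using the `llama-stack-client` Python SDK; the server URL and model identifier are placeholders, and the provider wiring itself is assumed to have been done in the stack's run configuration, which this commit does not cover.

```python
# Sketch: talking to a running Llama Stack server whose inference API is
# backed by an external provider such as RamaLama. Assumes the server is
# already up and the provider is registered; URL and model id are placeholders.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")  # default Llama Stack port

# List the models exposed through the configured providers.
for model in client.models.list():
    print(model.identifier)

# Run a chat completion against one of them (placeholder model id).
response = client.inference.chat_completion(
    model_id="llama3.2:3b",
    messages=[{"role": "user", "content": "Hello from Llama Stack"}],
)
print(response.completion_message.content)
```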