links update

Commit 1ba64d822a (parent 537b16a915)
4 changed files with 16 additions and 18 deletions
@@ -34,16 +34,16 @@ Running inference on the underlying Llama model is one of the most critical requ
 
 - **Do you have access to a machine with powerful GPUs?**
   If so, we suggest:
-  - [`distribution-meta-reference-gpu`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/meta-reference-gpu.html)
+  - [`distribution-meta-reference-gpu`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/deployable_distro/meta-reference-gpu.html)
   - [`distribution-tgi`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/tgi.html)
 
 - **Are you running on a "regular" desktop machine?**
   If so, we suggest:
-  - [`distribution-ollama`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/ollama.html)
+  - [`distribution-ollama`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/deployable_distro/ollama.html)
 
 - **Do you have an API key for a remote inference provider like Fireworks, Together, etc.?** If so, we suggest:
-  - [`distribution-together`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/together.html)
-  - [`distribution-fireworks`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/fireworks.html)
+  - [`distribution-together`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/hosted_distro/together.html)
+  - [`distribution-fireworks`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/hosted_distro/fireworks.html)
 
 
 ### Quick Start Commands
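Context for reviewers: the documentation excerpt in this hunk ends just before its "Quick Start Commands" section. Whichever distribution a reader picks, client code then talks to the running stack server over HTTP. The snippet below is a minimal, hypothetical sketch only: it assumes the `llama_stack_client` Python package, a server listening on `localhost:5000`, and a placeholder model id; none of these specifics come from the diff itself and may differ by release.

```python
# Hypothetical usage sketch (not part of this commit): connect to a locally
# running Llama Stack distribution and request a chat completion.
from llama_stack_client import LlamaStackClient

# Assumed default address; adjust to wherever your chosen distribution listens.
client = LlamaStackClient(base_url="http://localhost:5000")

response = client.inference.chat_completion(
    model="Llama3.1-8B-Instruct",  # placeholder; use a model id your distribution serves
    messages=[{"role": "user", "content": "Hello, Llama Stack!"}],
)
print(response)
```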