diff --git a/docs/source/getting_started/developer_cookbook.md b/docs/source/getting_started/developer_cookbook.md
index 3aef150a5..152035e9f 100644
--- a/docs/source/getting_started/developer_cookbook.md
+++ b/docs/source/getting_started/developer_cookbook.md
@@ -26,7 +26,7 @@ Based on your developer needs, below are references to guides to help you get st
 * Developer Need: I want to use Llama Stack on-Device
 * Effort: 1.5hr
 * Guide:
-  - Please see our [iOS Llama Stack SDK](./ios_setup.md) implementations
+  - Please see our [iOS Llama Stack SDK](./ios_sdk.md) implementations

 ### Assemble your own Llama Stack Distribution
 * Developer Need: I want to assemble my own distribution with API providers to my likings
diff --git a/docs/source/getting_started/distributions/ondevice_distro/index.md b/docs/source/getting_started/distributions/ondevice_distro/index.md
index cf31719ac..b3228455d 100644
--- a/docs/source/getting_started/distributions/ondevice_distro/index.md
+++ b/docs/source/getting_started/distributions/ondevice_distro/index.md
@@ -5,5 +5,5 @@ On-device distributions are Llama Stack distributions that run locally on your i
 ```{toctree}
 :maxdepth: 1

-ios_setup
+ios_sdk
 ```
diff --git a/docs/source/getting_started/distributions/ondevice_distro/ios_setup.md b/docs/source/getting_started/distributions/ondevice_distro/ios_sdk.md
similarity index 99%
rename from docs/source/getting_started/distributions/ondevice_distro/ios_setup.md
rename to docs/source/getting_started/distributions/ondevice_distro/ios_sdk.md
index 7b4462097..08885ad73 100644
--- a/docs/source/getting_started/distributions/ondevice_distro/ios_setup.md
+++ b/docs/source/getting_started/distributions/ondevice_distro/ios_sdk.md
@@ -1,4 +1,4 @@
-# iOS Setup
+# iOS SDK

 We offer both remote and on-device use of Llama Stack in Swift via two components:

diff --git a/docs/source/getting_started/index.md b/docs/source/getting_started/index.md
index 5c59f5f9f..c79a6dce7 100644
--- a/docs/source/getting_started/index.md
+++ b/docs/source/getting_started/index.md
@@ -31,20 +31,20 @@ Running inference on the underlying Llama model is one of the most critical requ

 - **Do you have access to a machine with powerful GPUs?**
 If so, we suggest:
-  - [`distribution-meta-reference-gpu`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/self_hosted_distro/meta-reference-gpu.html)
-  - [`distribution-tgi`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/tgi.html)
+  - [distribution-meta-reference-gpu](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/self_hosted_distro/meta-reference-gpu.html)
+  - [distribution-tgi](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/tgi.html)

 - **Are you running on a "regular" desktop machine?**
 If so, we suggest:
-  - [`distribution-ollama`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/self_hosted_distro/ollama.html)
+  - [distribution-ollama](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/self_hosted_distro/ollama.html)

 - **Do you have an API key for a remote inference provider like Fireworks, Together, etc.?**
 If so, we suggest:
-  - [`distribution-together`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/remote_hosted_distro/together.html)
-  - [`distribution-fireworks`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/remote_hosted_distro/fireworks.html)
+  - [distribution-together](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/remote_hosted_distro/together.html)
+  - [distribution-fireworks](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/remote_hosted_distro/fireworks.html)

-- **Do you have an API key for a remote inference provider like Fireworks, Together, etc.?** If so, we suggest:
-  - [`distribution-together`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/remote_hosted_distro/together.html)
-  - [`distribution-fireworks`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/remote_hosted_distro/fireworks.html)
+- **Do you want to run Llama Stack inference on your iOS / Android device?** If so, we suggest:
+  - [iOS](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/ondevice_distro/ios_sdk.html)
+  - [Android](https://github.com/meta-llama/llama-stack-client-kotlin) (coming soon)

 Please see our pages in detail for the types of distributions we offer:
diff --git a/docs/source/index.md b/docs/source/index.md
index 95ab6258f..c5f339f21 100644
--- a/docs/source/index.md
+++ b/docs/source/index.md
@@ -86,7 +86,6 @@ You can find more example scripts with client SDKs to talk with the Llama Stack
 :maxdepth: 3

 getting_started/index
-getting_started/ios_setup
 cli_reference/index
 cli_reference/download_models
 api_providers/index