From 537b16a9156283b959a98c93ac722018cc8c7fdd Mon Sep 17 00:00:00 2001
From: Xi Yan
Date: Mon, 4 Nov 2024 13:29:10 -0800
Subject: [PATCH] ios image

---
 .../getting_started/distributions/ondevice_distro/ios_setup.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/getting_started/distributions/ondevice_distro/ios_setup.md b/docs/source/getting_started/distributions/ondevice_distro/ios_setup.md
index 0acace108..7b4462097 100644
--- a/docs/source/getting_started/distributions/ondevice_distro/ios_setup.md
+++ b/docs/source/getting_started/distributions/ondevice_distro/ios_setup.md
@@ -5,7 +5,7 @@ We offer both remote and on-device use of Llama Stack in Swift via two component
 1. [llama-stack-client-swift](https://github.com/meta-llama/llama-stack-client-swift/)
 2. [LocalInferenceImpl](https://github.com/meta-llama/llama-stack/tree/main/llama_stack/providers/impls/ios/inference)
 
-```{image} ../../_static/remote_or_local.gif
+```{image} ../../../../_static/remote_or_local.gif
 :alt: Seamlessly switching between local, on-device inference and remote hosted inference
 :width: 412px
 :align: center