diff --git a/docs/source/chat_local_cloud_guide.md b/docs/source/chat_local_cloud_guide.md
index bb7463897..ea2617ecc 100644
--- a/docs/source/chat_local_cloud_guide.md
+++ b/docs/source/chat_local_cloud_guide.md
@@ -131,3 +131,7 @@ async def get_llama_response(stream: bool = True):
 
 asyncio.run(get_llama_response())
 ```
+
+---
+
+With these fundamentals, you should be well on your way to leveraging Llama Stack’s text generation capabilities! For more advanced features, refer to the [Llama Stack Documentation](https://llama-stack.readthedocs.io/en/latest/).