added cloud-local-inference-guide

This commit is contained in:
Justin Lee 2024-11-01 13:39:11 -07:00
parent 46763bc001
commit 1794ebc627
2 changed files with 135 additions and 2 deletions

@@ -1,7 +1,7 @@
# Llama Stack Text Generation Guide
- This document provides instructions on how to use Llama Stack's `chat_completion` function for generating text using the `Llama3.2-11B-Vision-Instruct` model. Before you begin, please ensure Llama Stack is installed and set up by following the [Getting Started Guide](https://llama-stack-docs.com/getting-started).
+ This document provides instructions on how to use Llama Stack's `chat_completion` function for generating text using the `Llama3.2-11B-Vision-Instruct` model. Before you begin, please ensure Llama Stack is installed and set up by following the [Getting Started Guide](https://llama-stack.readthedocs.io/en/latest/).
### Table of Contents
1. [Quickstart](#quickstart)
@@ -189,4 +189,4 @@ if __name__ == "__main__":
---
- With these fundamentals, you should be well on your way to leveraging Llama Stack's text generation capabilities! For more advanced features, refer to the [Llama Stack Documentation](https://llama-stack-docs.com).
+ With these fundamentals, you should be well on your way to leveraging Llama Stack's text generation capabilities! For more advanced features, refer to the [Llama Stack Documentation](https://llama-stack.readthedocs.io/en/latest/).
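The guide this commit updates centers on `chat_completion`. As a rough illustration of the kind of call the guide documents, here is a minimal sketch using the `llama-stack-client` Python package; the client API, server URL, and model identifier are assumptions based on typical Llama Stack usage, not taken from this diff, and the call itself requires a locally running Llama Stack server.

```python
# Sketch of a chat_completion call against a local Llama Stack server.
# Assumes `pip install llama-stack-client` and a server on localhost:5000;
# the client API and model id below are assumptions, not from this diff.


def build_messages(user_prompt: str,
                   system_prompt: str = "You are a helpful assistant.") -> list[dict]:
    """Build the messages payload in the role/content shape chat APIs expect."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


def main() -> None:
    # Imported lazily so the payload helper above is usable without the package.
    from llama_stack_client import LlamaStackClient

    client = LlamaStackClient(base_url="http://localhost:5000")
    response = client.inference.chat_completion(
        model_id="Llama3.2-11B-Vision-Instruct",  # assumed model identifier
        messages=build_messages("Write a haiku about local inference."),
    )
    print(response.completion_message.content)


if __name__ == "__main__":
    pass  # main() needs a live Llama Stack server; invoke it manually
```

The message-building helper is separated out so the request shape can be inspected or tested without a running server.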