Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-08-03 09:21:45 +00:00)
changed copy
Signed-off-by: Francisco Javier Arceo <farceo@redhat.com>
This commit is contained in:
commit be112fad4f (parent 59861a4ea5)
1 changed file with 2 additions and 2 deletions
```diff
@@ -220,7 +220,7 @@ Other SDKs are also available, please refer to the [Client SDK](../index.md#clie
 ::::{tab-set}

 :::{tab-item} Basic Inference
-Alternatively, you can run inference using the Llama Stack client SDK.
+Now you can run inference using the Llama Stack client SDK.

 ### i. Create the Script
 Create a file `inference.py` and add the following code:
```
```diff
@@ -265,7 +265,7 @@ Beauty in the bits
 :::

 :::{tab-item} Build a Simple Agent
-Now we can move beyond simple inference and build an agent that can perform tasks using the Llama Stack server.
+Next we can move beyond simple inference and build an agent that can perform tasks using the Llama Stack server.

 ### i. Create the Script
 Create a file `agent.py` and add the following code:
```