# What does this PR do?

- Adds a note about unexpected Brave Search output appearing even when Tavily Search is called. This behavior is expected for now and is a work in progress (https://github.com/meta-llama/llama-stack/issues/1229). The note aims to clear up any confusion for new users.
- Adds two example scripts demonstrating how to build an agent using:
  1. the WebSearch tool
  2. the WolframAlpha tool

  These examples give new users a quick, working reference for integrating these tools (see the sketch after the test plan).

[//]: # (If resolving an issue, uncomment and update the line below)
[//]: # (Closes #[issue-number])

## Test Plan

Tested these example scripts using the following steps:

Step 1: `ollama run llama3.2:3b-instruct-fp16 --keepalive 60m`

Step 2:
```
export INFERENCE_MODEL="meta-llama/Llama-3.2-3B-Instruct"
export LLAMA_STACK_PORT=8321
```

Step 3: `llama stack run --image-type conda ~/llama-stack/llama_stack/templates/ollama/run.yaml`

Step 4: Run the example script with your API keys.

Expected output:

[//]: # (## Documentation)
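For context, the shape of the example scripts looks roughly like the sketch below. This is illustrative only, not the exact code added by this PR: it assumes the llama-stack-client Python SDK (`Agent`, `AgentConfig`, `EventLogger`), the `builtin::websearch` / `builtin::wolfram_alpha` tool groups, and API keys supplied through `provider_data` with the env var name `TAVILY_SEARCH_API_KEY`; adjust names to match the scripts actually committed.

```python
# Hypothetical sketch of a web-search agent script (not the exact file added in
# this PR). Assumes a Llama Stack server running locally on $LLAMA_STACK_PORT
# (8321) via the Ollama template, and the llama-stack-client Python SDK.
import os

from llama_stack_client import LlamaStackClient
from llama_stack_client.lib.agents.agent import Agent
from llama_stack_client.lib.agents.event_logger import EventLogger
from llama_stack_client.types.agent_create_params import AgentConfig

# The Tavily API key is passed as provider data; for the WolframAlpha example,
# pass "wolfram_alpha_api_key" instead (the key names here are assumptions).
client = LlamaStackClient(
    base_url=f"http://localhost:{os.environ.get('LLAMA_STACK_PORT', '8321')}",
    provider_data={"tavily_search_api_key": os.environ["TAVILY_SEARCH_API_KEY"]},
)

agent_config = AgentConfig(
    model=os.environ.get("INFERENCE_MODEL", "meta-llama/Llama-3.2-3B-Instruct"),
    instructions="You are a helpful assistant. Use the web search tool to answer questions about current events.",
    toolgroups=["builtin::websearch"],  # use ["builtin::wolfram_alpha"] for the second example
    input_shields=[],
    output_shields=[],
    enable_session_persistence=False,
)

agent = Agent(client, agent_config)
session_id = agent.create_session("websearch-example")

# create_turn streams events; EventLogger pretty-prints them, including any
# tool invocations made during the agent execution loop.
response = agent.create_turn(
    messages=[{"role": "user", "content": "What are the latest developments in quantum computing?"}],
    session_id=session_id,
)
for log in EventLogger().log(response):
    log.print()
```

In this sketch, switching between the two examples only means changing the tool group and the API key passed via `provider_data`; everything else (session creation, `create_turn`, event logging) stays the same.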