Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-03 09:53:45 +00:00)

Commit 8942c213dc (parent e172ed3cd7): updated the description

1 changed file with 699 additions and 695 deletions
@@ -10,18 +10,6 @@
 "\n",
 "This notebook demonstrates how to use **AutoGen (AG2)** with **Llama Stack** as the backend.\n",
 "\n",
-"### What is AutoGen?\n",
-"- Microsoft's framework for **conversational multi-agent** systems\n",
-"- Emphasizes **chat-based** interactions between agents\n",
-"- Built-in **code execution** and **human-in-the-loop**\n",
-"- Great for **interactive problem-solving**\n",
-"\n",
-"### Why Llama Stack?\n",
-"- **Unified backend** for any LLM (Ollama, Together, vLLM, etc.)\n",
-"- **One integration point** instead of many\n",
-"- **Production-ready** infrastructure\n",
-"- **Open-source** and flexible\n",
-"\n",
 "### Use Cases Covered:\n",
 "1. **Two-Agent Conversation** - UserProxy + Assistant solving a problem\n",
 "2. **Code Generation & Execution** - AutoGen generates and runs code\n",
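The notebook intro above pitches Llama Stack as a single OpenAI-compatible integration point for AG2. A minimal sketch of what that wiring typically looks like follows; the port (8321 is Llama Stack's common default), the `/v1/openai/v1` path, and the model id are illustrative assumptions, not values taken from this commit.

```python
# Sketch: pointing AG2 (AutoGen) at a locally running Llama Stack server.
# All endpoint details below are assumptions for illustration.
config_list = [
    {
        "model": "meta-llama/Llama-3.2-3B-Instruct",       # any model registered with the stack (assumed)
        "base_url": "http://localhost:8321/v1/openai/v1",  # Llama Stack's OpenAI-compatible API (assumed path)
        "api_key": "none",                                 # local servers typically don't check this
    }
]
llm_config = {"config_list": config_list, "temperature": 0.7}

# With the AG2 package installed, use case 1 above (two-agent conversation)
# would then look roughly like:
#   import autogen
#   assistant = autogen.AssistantAgent("assistant", llm_config=llm_config)
#   user = autogen.UserProxyAgent("user", human_input_mode="NEVER",
#                                 code_execution_config=False)
#   user.initiate_chat(assistant, message="Solve: what is 17 * 24?")
```

Because the server speaks the OpenAI wire protocol, swapping the underlying provider (Ollama, Together, vLLM) changes only the Llama Stack configuration, not this AG2-side config.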
@@ -691,8 +679,24 @@
   }
  ],
  "metadata": {
-  "orig_nbformat": 4
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.12.7"
+  }
  },
  "nbformat": 4,
- "nbformat_minor": 2
+ "nbformat_minor": 4
 }