{
"cells": [
{
"cell_type": "markdown",
"id": "c1e7571c",
"metadata": {
"id": "c1e7571c"
},
"source": [
"[](https://colab.research.google.com/drive/1F2ksmkoGQPa4pzRjMOE6BXWeOxWFIW6n?usp=sharing)\n",
"\n",
"# Llama Stack - Building AI Applications\n",
"\n",
"\n",
"\n",
"[Llama Stack](https://github.com/meta-llama/llama-stack) defines and standardizes the set of core building blocks needed to bring generative AI applications to market. These building blocks are presented in the form of interoperable APIs with a broad set of Service Providers providing their implementations.\n",
"\n",
"Read more about the project: https://llama-stack.readthedocs.io/en/latest/index.html\n",
"\n",
"In this guide, we will showcase how you can build LLM-powered agentic applications using Llama Stack.\n"
]
},
{
"cell_type": "markdown",
"id": "4CV1Q19BDMVw",
"metadata": {
"id": "4CV1Q19BDMVw"
},
"source": [
"## 1. Getting started with Llama Stack"
]
},
{
"cell_type": "markdown",
"id": "K4AvfUAJZOeS",
"metadata": {
"id": "K4AvfUAJZOeS"
},
"source": [
"### 1.1. Create TogetherAI account\n",
"\n",
"\n",
"In order to run inference for the llama models, you will need to use an inference provider. Llama stack supports a number of inference [providers](https://github.com/meta-llama/llama-stack/tree/main/llama_stack/providers/remote/inference).\n",
"\n",
"\n",
"In this showcase, we will use [together.ai](https://www.together.ai/) as the inference provider. So, you would first get an API key from Together if you dont have one already.\n",
"\n",
"Steps [here](https://docs.google.com/document/d/1Vg998IjRW_uujAPnHdQ9jQWvtmkZFt74FldW2MblxPY/edit?usp=sharing).\n",
"\n",
"You can also use Fireworks.ai or even Ollama if you would like to.\n",
"\n",
"\n",
"\n",
"> **Note:** Set the API Key in the Secrets of this notebook\n",
"\n"
]
},
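{
 "cell_type": "markdown",
 "id": "colab-secret-sketch",
 "metadata": {
  "id": "colab-secret-sketch"
 },
 "source": [
  "Once the key is stored in the notebook's Secrets, you can load it into an environment variable instead of hard-coding it in a cell. Below is a minimal sketch, assuming you are running in Google Colab and saved the secret under the (hypothetical) name `TOGETHER_API_KEY`:\n",
  "\n",
  "```python\n",
  "import os\n",
  "from google.colab import userdata  # Colab helper for reading notebook Secrets\n",
  "\n",
  "# Assumes a secret named TOGETHER_API_KEY was added via the Secrets panel\n",
  "os.environ['TOGETHER_API_KEY'] = userdata.get('TOGETHER_API_KEY')\n",
  "```"
 ]
},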
{
"cell_type": "markdown",
"id": "oDUB7M_qe-Gs",
"metadata": {
"id": "oDUB7M_qe-Gs"
},
"source": [
"### 1.2. Install Llama Stack\n",
"\n",
"We will now start with installing the [llama-stack pypi package](https://pypi.org/project/llama-stack).\n",
"\n",
"In addition, we will install [bubblewrap](https://github.com/containers/bubblewrap), a low level light-weight container framework that runs in the user namespace. We will use it to execute code generated by Llama in one of the examples."
]
},
{
"cell_type": "code",
"execution_count": 42,
"id": "J2kGed0R5PSf",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"collapsed": true,
"id": "J2kGed0R5PSf",
"outputId": "7d543c6f-623d-4911-b9a7-4ed24d5b82f2"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Reading package lists... Done\n",
"Building dependency tree... Done\n",
"Reading state information... Done\n",
"bubblewrap is already the newest version (0.6.1-1ubuntu0.1).\n",
"0 upgraded, 0 newly installed, 0 to remove and 49 not upgraded.\n",
"Requirement already satisfied: llama-stack in /usr/local/lib/python3.10/dist-packages (0.0.61)\n",
"Requirement already satisfied: blobfile in /usr/local/lib/python3.10/dist-packages (from llama-stack) (3.0.0)\n",
"Requirement already satisfied: fire in /usr/local/lib/python3.10/dist-packages (from llama-stack) (0.7.0)\n",
"Requirement already satisfied: httpx in /usr/local/lib/python3.10/dist-packages (from llama-stack) (0.28.1)\n",
"Requirement already satisfied: huggingface-hub in /usr/local/lib/python3.10/dist-packages (from llama-stack) (0.26.5)\n",
"Requirement already satisfied: llama-models>=0.0.61 in /usr/local/lib/python3.10/dist-packages (from llama-stack) (0.0.61)\n",
"Requirement already satisfied: llama-stack-client>=0.0.61 in /usr/local/lib/python3.10/dist-packages (from llama-stack) (0.0.61)\n",
"Requirement already satisfied: prompt-toolkit in /usr/local/lib/python3.10/dist-packages (from llama-stack) (3.0.48)\n",
"Requirement already satisfied: python-dotenv in /usr/local/lib/python3.10/dist-packages (from llama-stack) (1.0.1)\n",
"Requirement already satisfied: pydantic>=2 in /usr/local/lib/python3.10/dist-packages (from llama-stack) (2.10.3)\n",
"Requirement already satisfied: requests in /usr/local/lib/python3.10/dist-packages (from llama-stack) (2.32.3)\n",
"Requirement already satisfied: rich in /usr/local/lib/python3.10/dist-packages (from llama-stack) (13.9.4)\n",
"Requirement already satisfied: setuptools in /usr/local/lib/python3.10/dist-packages (from llama-stack) (75.1.0)\n",
"Requirement already satisfied: termcolor in /usr/local/lib/python3.10/dist-packages (from llama-stack) (2.5.0)\n",
"Requirement already satisfied: PyYAML in /usr/local/lib/python3.10/dist-packages (from llama-models>=0.0.61->llama-stack) (6.0.2)\n",
"Requirement already satisfied: jinja2 in /usr/local/lib/python3.10/dist-packages (from llama-models>=0.0.61->llama-stack) (3.1.4)\n",
"Requirement already satisfied: tiktoken in /usr/local/lib/python3.10/dist-packages (from llama-models>=0.0.61->llama-stack) (0.8.0)\n",
"Requirement already satisfied: Pillow in /usr/local/lib/python3.10/dist-packages (from llama-models>=0.0.61->llama-stack) (10.4.0)\n",
"Requirement already satisfied: anyio<5,>=3.5.0 in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (3.7.1)\n",
"Requirement already satisfied: click in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (8.1.7)\n",
"Requirement already satisfied: distro<2,>=1.7.0 in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (1.9.0)\n",
"Requirement already satisfied: pandas in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (2.2.2)\n",
"Requirement already satisfied: pyaml in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (24.12.1)\n",
"Requirement already satisfied: sniffio in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (1.3.1)\n",
"Requirement already satisfied: tqdm in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (4.66.6)\n",
"Requirement already satisfied: typing-extensions<5,>=4.7 in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (4.12.2)\n",
"Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from httpx->llama-stack) (2024.8.30)\n",
"Requirement already satisfied: httpcore==1.* in /usr/local/lib/python3.10/dist-packages (from httpx->llama-stack) (1.0.7)\n",
"Requirement already satisfied: idna in /usr/local/lib/python3.10/dist-packages (from httpx->llama-stack) (3.10)\n",
"Requirement already satisfied: h11<0.15,>=0.13 in /usr/local/lib/python3.10/dist-packages (from httpcore==1.*->httpx->llama-stack) (0.14.0)\n",
"Requirement already satisfied: annotated-types>=0.6.0 in /usr/local/lib/python3.10/dist-packages (from pydantic>=2->llama-stack) (0.7.0)\n",
"Requirement already satisfied: pydantic-core==2.27.1 in /usr/local/lib/python3.10/dist-packages (from pydantic>=2->llama-stack) (2.27.1)\n",
"Requirement already satisfied: pycryptodomex>=3.8 in /usr/local/lib/python3.10/dist-packages (from blobfile->llama-stack) (3.21.0)\n",
"Requirement already satisfied: urllib3<3,>=1.25.3 in /usr/local/lib/python3.10/dist-packages (from blobfile->llama-stack) (2.2.3)\n",
"Requirement already satisfied: lxml>=4.9 in /usr/local/lib/python3.10/dist-packages (from blobfile->llama-stack) (5.3.0)\n",
"Requirement already satisfied: filelock>=3.0 in /usr/local/lib/python3.10/dist-packages (from blobfile->llama-stack) (3.16.1)\n",
"Requirement already satisfied: fsspec>=2023.5.0 in /usr/local/lib/python3.10/dist-packages (from huggingface-hub->llama-stack) (2024.9.0)\n",
"Requirement already satisfied: packaging>=20.9 in /usr/local/lib/python3.10/dist-packages (from huggingface-hub->llama-stack) (24.2)\n",
"Requirement already satisfied: wcwidth in /usr/local/lib/python3.10/dist-packages (from prompt-toolkit->llama-stack) (0.2.13)\n",
"Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests->llama-stack) (3.4.0)\n",
"Requirement already satisfied: markdown-it-py>=2.2.0 in /usr/local/lib/python3.10/dist-packages (from rich->llama-stack) (3.0.0)\n",
"Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /usr/local/lib/python3.10/dist-packages (from rich->llama-stack) (2.18.0)\n",
"Requirement already satisfied: exceptiongroup in /usr/local/lib/python3.10/dist-packages (from anyio<5,>=3.5.0->llama-stack-client>=0.0.61->llama-stack) (1.2.2)\n",
"Requirement already satisfied: mdurl~=0.1 in /usr/local/lib/python3.10/dist-packages (from markdown-it-py>=2.2.0->rich->llama-stack) (0.1.2)\n",
"Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2->llama-models>=0.0.61->llama-stack) (3.0.2)\n",
"Requirement already satisfied: numpy>=1.22.4 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-stack-client>=0.0.61->llama-stack) (1.26.4)\n",
"Requirement already satisfied: python-dateutil>=2.8.2 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-stack-client>=0.0.61->llama-stack) (2.8.2)\n",
"Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-stack-client>=0.0.61->llama-stack) (2024.2)\n",
"Requirement already satisfied: tzdata>=2022.7 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-stack-client>=0.0.61->llama-stack) (2024.2)\n",
"Requirement already satisfied: regex>=2022.1.18 in /usr/local/lib/python3.10/dist-packages (from tiktoken->llama-models>=0.0.61->llama-stack) (2024.9.11)\n",
"Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/dist-packages (from python-dateutil>=2.8.2->pandas->llama-stack-client>=0.0.61->llama-stack) (1.17.0)\n"
]
}
],
"source": [
"!apt-get install -y bubblewrap\n",
"!pip install -U llama-stack"
]
},
{
"cell_type": "markdown",
"id": "414301dc",
"metadata": {
"id": "414301dc"
},
"source": [
"### 1.3. Configure Llama Stack for Together\n",
"\n",
"\n",
"Llama Stack is architected as a collection of lego blocks which can be assembled as needed.\n",
"\n",
"\n",
"Typically, llama stack is available as a server with an endpoint that you can hit. We call this endpoint a [Distribution](https://llama-stack.readthedocs.io/en/latest/concepts/index.html#distributions). Partners like Together and Fireworks offer their own Llama Stack Distribution endpoints.\n",
"\n",
"In this showcase, we are going to use llama stack inline as a library. So, given a particular set of providers, we must first package up the right set of dependencies. We have a template to use Together as an inference provider and [faiss](https://ai.meta.com/tools/faiss/) for memory/RAG.\n",
"\n",
"We will run `llama stack build` to deploy all dependencies."
]
},
{
"cell_type": "code",
"execution_count": 43,
"id": "HaepEZXCDgif",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"collapsed": true,
"id": "HaepEZXCDgif",
"outputId": "9c268d26-7444-4741-f14d-3911eea8e4eb"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Requirement already satisfied: llama-stack in /usr/local/lib/python3.10/dist-packages (0.0.61)\r\n",
"Requirement already satisfied: blobfile in /usr/local/lib/python3.10/dist-packages (from llama-stack) (3.0.0)\r\n",
"Requirement already satisfied: fire in /usr/local/lib/python3.10/dist-packages (from llama-stack) (0.7.0)\r\n",
"Requirement already satisfied: httpx in /usr/local/lib/python3.10/dist-packages (from llama-stack) (0.28.1)\r\n",
"Requirement already satisfied: huggingface-hub in /usr/local/lib/python3.10/dist-packages (from llama-stack) (0.26.5)\r\n",
"Requirement already satisfied: llama-models>=0.0.61 in /usr/local/lib/python3.10/dist-packages (from llama-stack) (0.0.61)\r\n",
"Requirement already satisfied: llama-stack-client>=0.0.61 in /usr/local/lib/python3.10/dist-packages (from llama-stack) (0.0.61)\r\n",
"Requirement already satisfied: prompt-toolkit in /usr/local/lib/python3.10/dist-packages (from llama-stack) (3.0.48)\r\n",
"Requirement already satisfied: python-dotenv in /usr/local/lib/python3.10/dist-packages (from llama-stack) (1.0.1)\r\n",
"Requirement already satisfied: pydantic>=2 in /usr/local/lib/python3.10/dist-packages (from llama-stack) (2.10.3)\r\n",
"Requirement already satisfied: requests in /usr/local/lib/python3.10/dist-packages (from llama-stack) (2.32.3)\r\n",
"Requirement already satisfied: rich in /usr/local/lib/python3.10/dist-packages (from llama-stack) (13.9.4)\r\n",
"Requirement already satisfied: setuptools in /usr/local/lib/python3.10/dist-packages (from llama-stack) (75.1.0)\r\n",
"Requirement already satisfied: termcolor in /usr/local/lib/python3.10/dist-packages (from llama-stack) (2.5.0)\r\n",
"Requirement already satisfied: PyYAML in /usr/local/lib/python3.10/dist-packages (from llama-models>=0.0.61->llama-stack) (6.0.2)\r\n",
"Requirement already satisfied: jinja2 in /usr/local/lib/python3.10/dist-packages (from llama-models>=0.0.61->llama-stack) (3.1.4)\r\n",
"Requirement already satisfied: tiktoken in /usr/local/lib/python3.10/dist-packages (from llama-models>=0.0.61->llama-stack) (0.8.0)\r\n",
"Requirement already satisfied: Pillow in /usr/local/lib/python3.10/dist-packages (from llama-models>=0.0.61->llama-stack) (10.4.0)\r\n",
"Requirement already satisfied: anyio<5,>=3.5.0 in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (3.7.1)\r\n",
"Requirement already satisfied: click in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (8.1.7)\r\n",
"Requirement already satisfied: distro<2,>=1.7.0 in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (1.9.0)\r\n",
"Requirement already satisfied: pandas in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (2.2.2)\r\n",
"Requirement already satisfied: pyaml in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (24.12.1)\r\n",
"Requirement already satisfied: sniffio in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (1.3.1)\r\n",
"Requirement already satisfied: tqdm in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (4.66.6)\r\n",
"Requirement already satisfied: typing-extensions<5,>=4.7 in /usr/local/lib/python3.10/dist-packages (from llama-stack-client>=0.0.61->llama-stack) (4.12.2)\r\n",
"Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from httpx->llama-stack) (2024.8.30)\r\n",
"Requirement already satisfied: httpcore==1.* in /usr/local/lib/python3.10/dist-packages (from httpx->llama-stack) (1.0.7)\r\n",
"Requirement already satisfied: idna in /usr/local/lib/python3.10/dist-packages (from httpx->llama-stack) (3.10)\r\n",
"Requirement already satisfied: h11<0.15,>=0.13 in /usr/local/lib/python3.10/dist-packages (from httpcore==1.*->httpx->llama-stack) (0.14.0)\r\n",
"Requirement already satisfied: annotated-types>=0.6.0 in /usr/local/lib/python3.10/dist-packages (from pydantic>=2->llama-stack) (0.7.0)\r\n",
"Requirement already satisfied: pydantic-core==2.27.1 in /usr/local/lib/python3.10/dist-packages (from pydantic>=2->llama-stack) (2.27.1)\r\n",
"Requirement already satisfied: pycryptodomex>=3.8 in /usr/local/lib/python3.10/dist-packages (from blobfile->llama-stack) (3.21.0)\r\n",
"Requirement already satisfied: urllib3<3,>=1.25.3 in /usr/local/lib/python3.10/dist-packages (from blobfile->llama-stack) (2.2.3)\r\n",
"Requirement already satisfied: lxml>=4.9 in /usr/local/lib/python3.10/dist-packages (from blobfile->llama-stack) (5.3.0)\r\n",
"Requirement already satisfied: filelock>=3.0 in /usr/local/lib/python3.10/dist-packages (from blobfile->llama-stack) (3.16.1)\n",
"Requirement already satisfied: fsspec>=2023.5.0 in /usr/local/lib/python3.10/dist-packages (from huggingface-hub->llama-stack) (2024.9.0)\n",
"Requirement already satisfied: packaging>=20.9 in /usr/local/lib/python3.10/dist-packages (from huggingface-hub->llama-stack) (24.2)\n",
"Requirement already satisfied: wcwidth in /usr/local/lib/python3.10/dist-packages (from prompt-toolkit->llama-stack) (0.2.13)\n",
"Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests->llama-stack) (3.4.0)\n",
"Requirement already satisfied: markdown-it-py>=2.2.0 in /usr/local/lib/python3.10/dist-packages (from rich->llama-stack) (3.0.0)\n",
"Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /usr/local/lib/python3.10/dist-packages (from rich->llama-stack) (2.18.0)\n",
"Requirement already satisfied: exceptiongroup in /usr/local/lib/python3.10/dist-packages (from anyio<5,>=3.5.0->llama-stack-client>=0.0.61->llama-stack) (1.2.2)\n",
"Requirement already satisfied: mdurl~=0.1 in /usr/local/lib/python3.10/dist-packages (from markdown-it-py>=2.2.0->rich->llama-stack) (0.1.2)\n",
"Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2->llama-models>=0.0.61->llama-stack) (3.0.2)\n",
"Requirement already satisfied: numpy>=1.22.4 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-stack-client>=0.0.61->llama-stack) (1.26.4)\n",
"Requirement already satisfied: python-dateutil>=2.8.2 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-stack-client>=0.0.61->llama-stack) (2.8.2)\n",
"Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-stack-client>=0.0.61->llama-stack) (2024.2)\n",
"Requirement already satisfied: tzdata>=2022.7 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-stack-client>=0.0.61->llama-stack) (2024.2)\n",
"Requirement already satisfied: regex>=2022.1.18 in /usr/local/lib/python3.10/dist-packages (from tiktoken->llama-models>=0.0.61->llama-stack) (2024.9.11)\n",
"Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/dist-packages (from python-dateutil>=2.8.2->pandas->llama-stack-client>=0.0.61->llama-stack) (1.17.0)\n",
"Installing pip dependencies\n",
"Requirement already satisfied: pillow in /usr/local/lib/python3.10/dist-packages (10.4.0)\n",
"Requirement already satisfied: transformers in /usr/local/lib/python3.10/dist-packages (4.46.3)\n",
"Requirement already satisfied: psycopg2-binary in /usr/local/lib/python3.10/dist-packages (2.9.10)\n",
"Requirement already satisfied: aiosqlite in /usr/local/lib/python3.10/dist-packages (0.20.0)\n",
"Requirement already satisfied: tqdm in /usr/local/lib/python3.10/dist-packages (4.66.6)\n",
"Requirement already satisfied: pypdf in /usr/local/lib/python3.10/dist-packages (5.1.0)\n",
"Requirement already satisfied: numpy in /usr/local/lib/python3.10/dist-packages (1.26.4)\n",
"Requirement already satisfied: scikit-learn in /usr/local/lib/python3.10/dist-packages (1.5.2)\n",
"Requirement already satisfied: redis in /usr/local/lib/python3.10/dist-packages (5.2.1)\n",
"Requirement already satisfied: opentelemetry-sdk in /usr/local/lib/python3.10/dist-packages (1.28.2)\n",
"Requirement already satisfied: sentencepiece in /usr/local/lib/python3.10/dist-packages (0.2.0)\n",
"Requirement already satisfied: blobfile in /usr/local/lib/python3.10/dist-packages (3.0.0)\n",
"Requirement already satisfied: together in /usr/local/lib/python3.10/dist-packages (1.3.5)\n",
"Requirement already satisfied: openai in /usr/local/lib/python3.10/dist-packages (1.54.5)\n",
"Requirement already satisfied: faiss-cpu in /usr/local/lib/python3.10/dist-packages (1.9.0.post1)\n",
"Requirement already satisfied: autoevals in /usr/local/lib/python3.10/dist-packages (0.0.110)\n",
"Requirement already satisfied: chardet in /usr/local/lib/python3.10/dist-packages (5.2.0)\n",
"Requirement already satisfied: nltk in /usr/local/lib/python3.10/dist-packages (3.9.1)\n",
"Requirement already satisfied: pandas in /usr/local/lib/python3.10/dist-packages (2.2.2)\n",
"Requirement already satisfied: opentelemetry-exporter-otlp-proto-http in /usr/local/lib/python3.10/dist-packages (1.28.2)\n",
"Requirement already satisfied: datasets in /usr/local/lib/python3.10/dist-packages (3.2.0)\n",
"Requirement already satisfied: matplotlib in /usr/local/lib/python3.10/dist-packages (3.8.0)\n",
"Requirement already satisfied: scipy in /usr/local/lib/python3.10/dist-packages (1.13.1)\n",
"Requirement already satisfied: chromadb-client in /usr/local/lib/python3.10/dist-packages (0.5.23)\n",
"Requirement already satisfied: fastapi in /usr/local/lib/python3.10/dist-packages (0.115.6)\n",
"Requirement already satisfied: fire in /usr/local/lib/python3.10/dist-packages (0.7.0)\n",
"Requirement already satisfied: httpx in /usr/local/lib/python3.10/dist-packages (0.28.1)\n",
"Requirement already satisfied: uvicorn in /usr/local/lib/python3.10/dist-packages (0.32.1)\n",
"Requirement already satisfied: filelock in /usr/local/lib/python3.10/dist-packages (from transformers) (3.16.1)\n",
"Requirement already satisfied: huggingface-hub<1.0,>=0.23.2 in /usr/local/lib/python3.10/dist-packages (from transformers) (0.26.5)\n",
"Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.10/dist-packages (from transformers) (24.2)\n",
"Requirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.10/dist-packages (from transformers) (6.0.2)\n",
"Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.10/dist-packages (from transformers) (2024.9.11)\n",
"Requirement already satisfied: requests in /usr/local/lib/python3.10/dist-packages (from transformers) (2.32.3)\n",
"Requirement already satisfied: tokenizers<0.21,>=0.20 in /usr/local/lib/python3.10/dist-packages (from transformers) (0.20.3)\n",
"Requirement already satisfied: safetensors>=0.4.1 in /usr/local/lib/python3.10/dist-packages (from transformers) (0.4.5)\n",
"Requirement already satisfied: typing_extensions>=4.0 in /usr/local/lib/python3.10/dist-packages (from aiosqlite) (4.12.2)\n",
"Requirement already satisfied: joblib>=1.2.0 in /usr/local/lib/python3.10/dist-packages (from scikit-learn) (1.4.2)\n",
"Requirement already satisfied: threadpoolctl>=3.1.0 in /usr/local/lib/python3.10/dist-packages (from scikit-learn) (3.5.0)\n",
"Requirement already satisfied: async-timeout>=4.0.3 in /usr/local/lib/python3.10/dist-packages (from redis) (4.0.3)\n",
"Requirement already satisfied: opentelemetry-api==1.28.2 in /usr/local/lib/python3.10/dist-packages (from opentelemetry-sdk) (1.28.2)\n",
"Requirement already satisfied: opentelemetry-semantic-conventions==0.49b2 in /usr/local/lib/python3.10/dist-packages (from opentelemetry-sdk) (0.49b2)\n",
"Requirement already satisfied: deprecated>=1.2.6 in /usr/local/lib/python3.10/dist-packages (from opentelemetry-api==1.28.2->opentelemetry-sdk) (1.2.15)\n",
"Requirement already satisfied: importlib-metadata<=8.5.0,>=6.0 in /usr/local/lib/python3.10/dist-packages (from opentelemetry-api==1.28.2->opentelemetry-sdk) (8.5.0)\n",
"Requirement already satisfied: pycryptodomex>=3.8 in /usr/local/lib/python3.10/dist-packages (from blobfile) (3.21.0)\n",
"Requirement already satisfied: urllib3<3,>=1.25.3 in /usr/local/lib/python3.10/dist-packages (from blobfile) (2.2.3)\n",
"Requirement already satisfied: lxml>=4.9 in /usr/local/lib/python3.10/dist-packages (from blobfile) (5.3.0)\n",
"Requirement already satisfied: aiohttp<4.0.0,>=3.9.3 in /usr/local/lib/python3.10/dist-packages (from together) (3.11.10)\n",
"Requirement already satisfied: click<9.0.0,>=8.1.7 in /usr/local/lib/python3.10/dist-packages (from together) (8.1.7)\n",
"Requirement already satisfied: eval-type-backport<0.3.0,>=0.1.3 in /usr/local/lib/python3.10/dist-packages (from together) (0.2.0)\n",
"Requirement already satisfied: pyarrow>=10.0.1 in /usr/local/lib/python3.10/dist-packages (from together) (17.0.0)\n",
"Requirement already satisfied: pydantic<3.0.0,>=2.6.3 in /usr/local/lib/python3.10/dist-packages (from together) (2.10.3)\n",
"Requirement already satisfied: rich<14.0.0,>=13.8.1 in /usr/local/lib/python3.10/dist-packages (from together) (13.9.4)\n",
"Requirement already satisfied: tabulate<0.10.0,>=0.9.0 in /usr/local/lib/python3.10/dist-packages (from together) (0.9.0)\n",
"Requirement already satisfied: typer<0.14,>=0.9 in /usr/local/lib/python3.10/dist-packages (from together) (0.13.1)\n",
"Requirement already satisfied: anyio<5,>=3.5.0 in /usr/local/lib/python3.10/dist-packages (from openai) (3.7.1)\n",
"Requirement already satisfied: distro<2,>=1.7.0 in /usr/local/lib/python3.10/dist-packages (from openai) (1.9.0)\n",
"Requirement already satisfied: jiter<1,>=0.4.0 in /usr/local/lib/python3.10/dist-packages (from openai) (0.8.2)\n",
"Requirement already satisfied: sniffio in /usr/local/lib/python3.10/dist-packages (from openai) (1.3.1)\n",
"Requirement already satisfied: chevron in /usr/local/lib/python3.10/dist-packages (from autoevals) (0.14.0)\n",
"Requirement already satisfied: levenshtein in /usr/local/lib/python3.10/dist-packages (from autoevals) (0.26.1)\n",
"Requirement already satisfied: braintrust_core==0.0.54 in /usr/local/lib/python3.10/dist-packages (from autoevals) (0.0.54)\n",
"Requirement already satisfied: jsonschema in /usr/local/lib/python3.10/dist-packages (from autoevals) (4.23.0)\n",
"Requirement already satisfied: python-dateutil>=2.8.2 in /usr/local/lib/python3.10/dist-packages (from pandas) (2.8.2)\n",
"Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas) (2024.2)\n",
"Requirement already satisfied: tzdata>=2022.7 in /usr/local/lib/python3.10/dist-packages (from pandas) (2024.2)\n",
"Requirement already satisfied: googleapis-common-protos~=1.52 in /usr/local/lib/python3.10/dist-packages (from opentelemetry-exporter-otlp-proto-http) (1.66.0)\n",
"Requirement already satisfied: opentelemetry-exporter-otlp-proto-common==1.28.2 in /usr/local/lib/python3.10/dist-packages (from opentelemetry-exporter-otlp-proto-http) (1.28.2)\n",
"Requirement already satisfied: opentelemetry-proto==1.28.2 in /usr/local/lib/python3.10/dist-packages (from opentelemetry-exporter-otlp-proto-http) (1.28.2)\n",
"Requirement already satisfied: protobuf<6.0,>=5.0 in /usr/local/lib/python3.10/dist-packages (from opentelemetry-proto==1.28.2->opentelemetry-exporter-otlp-proto-http) (5.29.1)\n",
"Requirement already satisfied: dill<0.3.9,>=0.3.0 in /usr/local/lib/python3.10/dist-packages (from datasets) (0.3.8)\n",
"Requirement already satisfied: xxhash in /usr/local/lib/python3.10/dist-packages (from datasets) (3.5.0)\n",
"Requirement already satisfied: multiprocess<0.70.17 in /usr/local/lib/python3.10/dist-packages (from datasets) (0.70.16)\n",
"Requirement already satisfied: fsspec<=2024.9.0,>=2023.1.0 in /usr/local/lib/python3.10/dist-packages (from fsspec[http]<=2024.9.0,>=2023.1.0->datasets) (2024.9.0)\n",
"Requirement already satisfied: contourpy>=1.0.1 in /usr/local/lib/python3.10/dist-packages (from matplotlib) (1.3.1)\n",
"Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.10/dist-packages (from matplotlib) (0.12.1)\n",
"Requirement already satisfied: fonttools>=4.22.0 in /usr/local/lib/python3.10/dist-packages (from matplotlib) (4.55.3)\n",
"Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.10/dist-packages (from matplotlib) (1.4.7)\n",
"Requirement already satisfied: pyparsing>=2.3.1 in /usr/local/lib/python3.10/dist-packages (from matplotlib) (3.2.0)\n",
"Requirement already satisfied: opentelemetry-exporter-otlp-proto-grpc>=1.2.0 in /usr/local/lib/python3.10/dist-packages (from chromadb-client) (1.28.2)\n",
"Requirement already satisfied: overrides>=7.3.1 in /usr/local/lib/python3.10/dist-packages (from chromadb-client) (7.7.0)\n",
"Requirement already satisfied: posthog>=2.4.0 in /usr/local/lib/python3.10/dist-packages (from chromadb-client) (3.7.4)\n",
"Requirement already satisfied: tenacity>=8.2.3 in /usr/local/lib/python3.10/dist-packages (from chromadb-client) (9.0.0)\n",
"Requirement already satisfied: orjson>=3.9.12 in /usr/local/lib/python3.10/dist-packages (from chromadb-client) (3.10.12)\n",
"Requirement already satisfied: starlette<0.42.0,>=0.40.0 in /usr/local/lib/python3.10/dist-packages (from fastapi) (0.41.3)\n",
"Requirement already satisfied: termcolor in /usr/local/lib/python3.10/dist-packages (from fire) (2.5.0)\n",
"Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from httpx) (2024.8.30)\n",
"Requirement already satisfied: httpcore==1.* in /usr/local/lib/python3.10/dist-packages (from httpx) (1.0.7)\n",
"Requirement already satisfied: idna in /usr/local/lib/python3.10/dist-packages (from httpx) (3.10)\n",
"Requirement already satisfied: h11<0.15,>=0.13 in /usr/local/lib/python3.10/dist-packages (from httpcore==1.*->httpx) (0.14.0)\n",
"Requirement already satisfied: aiohappyeyeballs>=2.3.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.9.3->together) (2.4.4)\n",
"Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.9.3->together) (1.3.1)\n",
"Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.9.3->together) (24.2.0)\n",
"Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.9.3->together) (1.5.0)\n",
"Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.9.3->together) (6.1.0)\n",
"Requirement already satisfied: propcache>=0.2.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.9.3->together) (0.2.1)\n",
"Requirement already satisfied: yarl<2.0,>=1.17.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.9.3->together) (1.18.3)\n",
"Requirement already satisfied: exceptiongroup in /usr/local/lib/python3.10/dist-packages (from anyio<5,>=3.5.0->openai) (1.2.2)\n",
"Requirement already satisfied: wrapt<2,>=1.10 in /usr/local/lib/python3.10/dist-packages (from deprecated>=1.2.6->opentelemetry-api==1.28.2->opentelemetry-sdk) (1.17.0)\n",
"Requirement already satisfied: grpcio<2.0.0,>=1.63.2 in /usr/local/lib/python3.10/dist-packages (from opentelemetry-exporter-otlp-proto-grpc>=1.2.0->chromadb-client) (1.68.1)\n",
"Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/dist-packages (from posthog>=2.4.0->chromadb-client) (1.17.0)\n",
"Requirement already satisfied: monotonic>=1.5 in /usr/local/lib/python3.10/dist-packages (from posthog>=2.4.0->chromadb-client) (1.6)\n",
"Requirement already satisfied: backoff>=1.10.0 in /usr/local/lib/python3.10/dist-packages (from posthog>=2.4.0->chromadb-client) (2.2.1)\n",
"Requirement already satisfied: annotated-types>=0.6.0 in /usr/local/lib/python3.10/dist-packages (from pydantic<3.0.0,>=2.6.3->together) (0.7.0)\n",
"Requirement already satisfied: pydantic-core==2.27.1 in /usr/local/lib/python3.10/dist-packages (from pydantic<3.0.0,>=2.6.3->together) (2.27.1)\n",
"Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests->transformers) (3.4.0)\n",
"Requirement already satisfied: markdown-it-py>=2.2.0 in /usr/local/lib/python3.10/dist-packages (from rich<14.0.0,>=13.8.1->together) (3.0.0)\n",
"Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /usr/local/lib/python3.10/dist-packages (from rich<14.0.0,>=13.8.1->together) (2.18.0)\n",
"Requirement already satisfied: shellingham>=1.3.0 in /usr/local/lib/python3.10/dist-packages (from typer<0.14,>=0.9->together) (1.5.4)\n",
"Requirement already satisfied: jsonschema-specifications>=2023.03.6 in /usr/local/lib/python3.10/dist-packages (from jsonschema->autoevals) (2024.10.1)\n",
"Requirement already satisfied: referencing>=0.28.4 in /usr/local/lib/python3.10/dist-packages (from jsonschema->autoevals) (0.35.1)\n",
"Requirement already satisfied: rpds-py>=0.7.1 in /usr/local/lib/python3.10/dist-packages (from jsonschema->autoevals) (0.22.3)\n",
"Requirement already satisfied: rapidfuzz<4.0.0,>=3.9.0 in /usr/local/lib/python3.10/dist-packages (from levenshtein->autoevals) (3.10.1)\n",
"Requirement already satisfied: zipp>=3.20 in /usr/local/lib/python3.10/dist-packages (from importlib-metadata<=8.5.0,>=6.0->opentelemetry-api==1.28.2->opentelemetry-sdk) (3.21.0)\n",
"Requirement already satisfied: mdurl~=0.1 in /usr/local/lib/python3.10/dist-packages (from markdown-it-py>=2.2.0->rich<14.0.0,>=13.8.1->together) (0.1.2)\n",
"sentence-transformers --no-deps\n",
"Requirement already satisfied: sentence-transformers in /usr/local/lib/python3.10/dist-packages (3.2.1)\n",
"torch --index-url https://download.pytorch.org/whl/cpu\n",
"Looking in indexes: https://download.pytorch.org/whl/cpu\n",
"Requirement already satisfied: torch in /usr/local/lib/python3.10/dist-packages (2.5.1+cu121)\n",
"Requirement already satisfied: filelock in /usr/local/lib/python3.10/dist-packages (from torch) (3.16.1)\n",
"Requirement already satisfied: typing-extensions>=4.8.0 in /usr/local/lib/python3.10/dist-packages (from torch) (4.12.2)\n",
"Requirement already satisfied: networkx in /usr/local/lib/python3.10/dist-packages (from torch) (3.4.2)\n",
"Requirement already satisfied: jinja2 in /usr/local/lib/python3.10/dist-packages (from torch) (3.1.4)\n",
"Requirement already satisfied: fsspec in /usr/local/lib/python3.10/dist-packages (from torch) (2024.9.0)\n",
"Requirement already satisfied: sympy==1.13.1 in /usr/local/lib/python3.10/dist-packages (from torch) (1.13.1)\n",
"Requirement already satisfied: mpmath<1.4,>=1.1.0 in /usr/local/lib/python3.10/dist-packages (from sympy==1.13.1->torch) (1.3.0)\n",
"Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2->torch) (3.0.2)\n",
"\u001b[32mBuild Successful!\u001b[0m\n"
]
}
],
"source": [
"# This will build all the dependencies you will need\n",
"!llama stack build --template together --image-type venv"
]
},
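{
 "cell_type": "markdown",
 "id": "remote-distro-sketch",
 "metadata": {
  "id": "remote-distro-sketch"
 },
 "source": [
  "As an aside: if you were connecting to a hosted Distribution endpoint (as described in 1.3) rather than running Llama Stack inline as a library, you would point the client at that server's URL instead. A minimal sketch, assuming a distribution is already serving at `http://localhost:5000` (a placeholder URL):\n",
  "\n",
  "```python\n",
  "from llama_stack_client import LlamaStackClient\n",
  "\n",
  "# Placeholder URL; substitute your distribution's endpoint\n",
  "client = LlamaStackClient(base_url='http://localhost:5000')\n",
  "```\n",
  "\n",
  "In this notebook, however, we continue with the library client set up in the next section."
 ]
},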
{
"cell_type": "markdown",
"id": "25b97dfe",
"metadata": {
"id": "25b97dfe"
},
"source": [
"### 1.4. Initialize Llama Stack\n",
"\n",
"Now that all dependencies have been installed, we can initialize llama stack. We will first set the `TOGETHER_API_KEY` environment variable\n"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "E1UFuJC570Tk",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 1000
},
"collapsed": true,
"id": "E1UFuJC570Tk",
"outputId": "bac7c9ec-ad49-4040-af43-8869f0afe5ac"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[33mWarning: `bwrap` is not available. Code interpreter tool will not work correctly.\u001b[0m\n"
]
},
{
"data": {
"text/html": [
"
Using config /Users/dineshyv/.llama/distributions/llamastack-together/together-run.yaml:\n",
"
\n"
],
"text/plain": [
"Using config \u001b[34m/Users/dineshyv/.llama/distributions/llamastack-together/\u001b[0m\u001b[34mtogether-run.yaml\u001b[0m:\n"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/html": [
"apis:\n", "- agents\n", "- datasetio\n", "- eval\n", "- inference\n", "- memory\n", "- safety\n", "- scoring\n", "- telemetry\n", "- tool_runtime\n", "conda_env: together\n", "datasets: []\n", "docker_image: null\n", "eval_tasks: []\n", "image_name: together\n", "memory_banks: []\n", "metadata_store:\n", " db_path: /Users/dineshyv/.llama/distributions/together/registry.db\n", " namespace: null\n", " type: sqlite\n", "models:\n", "- metadata: {}\n", " model_id: meta-llama/Llama-3.1-8B-Instruct\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo\n", "- metadata: {}\n", " model_id: meta-llama/Llama-3.1-70B-Instruct\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo\n", "- metadata: {}\n", " model_id: meta-llama/Llama-3.1-405B-Instruct-FP8\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo\n", "- metadata: {}\n", " model_id: meta-llama/Llama-3.2-3B-Instruct\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Llama-3.2-3B-Instruct-Turbo\n", "- metadata: {}\n", " model_id: meta-llama/Llama-3.2-11B-Vision-Instruct\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Llama-3.2-11B-Vision-Instruct-Turbo\n", "- metadata: {}\n", " model_id: meta-llama/Llama-3.2-90B-Vision-Instruct\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Llama-3.2-90B-Vision-Instruct-Turbo\n", "- metadata: {}\n", " model_id: meta-llama/Llama-Guard-3-8B\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Meta-Llama-Guard-3-8B\n", "- metadata: {}\n", " model_id: meta-llama/Llama-Guard-3-11B-Vision\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Llama-Guard-3-11B-Vision-Turbo\n", "- metadata:\n", " embedding_dimension: 384\n", " model_id: all-MiniLM-L6-v2\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - embedding\n", " provider_id: sentence-transformers\n", " provider_model_id: null\n", "providers:\n", " agents:\n", " - config:\n", " persistence_store:\n", " db_path: /Users/dineshyv/.llama/distributions/together/agents_store.db\n", " namespace: null\n", " type: sqlite\n", " provider_id: meta-reference\n", " provider_type: inline::meta-reference\n", " datasetio:\n", " - config: {}\n", " provider_id: huggingface\n", " provider_type: remote::huggingface\n", " - config: {}\n", " provider_id: localfs\n", " provider_type: inline::localfs\n", " eval:\n", " - config: {}\n", " provider_id: meta-reference\n", " provider_type: inline::meta-reference\n", " inference:\n", " - config:\n", " api_key: '********'\n", " url: https://api.together.xyz/v1\n", " provider_id: together\n", " provider_type: remote::together\n", " - config: {}\n", " provider_id: 
sentence-transformers\n", " provider_type: inline::sentence-transformers\n", " memory:\n", " - config:\n", " kvstore:\n", " db_path: /Users/dineshyv/.llama/distributions/together/faiss_store.db\n", " namespace: null\n", " type: sqlite\n", " provider_id: faiss\n", " provider_type: inline::faiss\n", " safety:\n", " - config: {}\n", " provider_id: llama-guard\n", " provider_type: inline::llama-guard\n", " scoring:\n", " - config: {}\n", " provider_id: basic\n", " provider_type: inline::basic\n", " - config: {}\n", " provider_id: llm-as-judge\n", " provider_type: inline::llm-as-judge\n", " - config:\n", " openai_api_key: '********'\n", " provider_id: braintrust\n", " provider_type: inline::braintrust\n", " telemetry:\n", " - config:\n", " service_name: llama-stack\n", " sinks: sqlite\n", " sqlite_db_path: /Users/dineshyv/.llama/distributions/together/trace_store.db\n", " provider_id: meta-reference\n", " provider_type: inline::meta-reference\n", " tool_runtime:\n", " - config:\n", " api_key: '********'\n", " provider_id: brave-search\n", " provider_type: remote::brave-search\n", " - config:\n", " api_key: '********'\n", " provider_id: tavily-search\n", " provider_type: remote::tavily-search\n", " - config: {}\n", " provider_id: code-interpreter\n", " provider_type: inline::code-interpreter\n", " - config: {}\n", " provider_id: memory-runtime\n", " provider_type: inline::memory-runtime\n", "scoring_fns: []\n", "shields:\n", "- params: null\n", " provider_id: null\n", " provider_shield_id: null\n", " shield_id: meta-llama/Llama-Guard-3-8B\n", "tool_groups:\n", "- provider_id: tavily-search\n", " tool_group:\n", " tools:\n", " - built_in_type: !!python/object/apply:llama_models.llama3.api.datatypes.BuiltinTool\n", " - brave_search\n", " metadata: {}\n", " type: built_in\n", " type: user_defined\n", " tool_group_id: brave_search_group\n", "- provider_id: code-interpreter\n", " tool_group:\n", " tools:\n", " - built_in_type: !!python/object/apply:llama_models.llama3.api.datatypes.BuiltinTool\n", " - code_interpreter\n", " metadata: {}\n", " type: built_in\n", " type: user_defined\n", " tool_group_id: code_interpreter_group\n", "version: '2'\n", "\n", "\n" ], "text/plain": [ "apis:\n", "- agents\n", "- datasetio\n", "- eval\n", "- inference\n", "- memory\n", "- safety\n", "- scoring\n", "- telemetry\n", "- tool_runtime\n", "conda_env: together\n", "datasets: \u001b[1m[\u001b[0m\u001b[1m]\u001b[0m\n", "docker_image: null\n", "eval_tasks: \u001b[1m[\u001b[0m\u001b[1m]\u001b[0m\n", "image_name: together\n", "memory_banks: \u001b[1m[\u001b[0m\u001b[1m]\u001b[0m\n", "metadata_store:\n", " db_path: \u001b[35m/Users/dineshyv/.llama/distributions/together/\u001b[0m\u001b[95mregistry.db\u001b[0m\n", " namespace: null\n", " type: sqlite\n", "models:\n", "- metadata: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " model_id: meta-llama/Llama-\u001b[1;36m3.1\u001b[0m-8B-Instruct\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Meta-Llama-\u001b[1;36m3.1\u001b[0m-8B-Instruct-Turbo\n", "- metadata: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " model_id: meta-llama/Llama-\u001b[1;36m3.1\u001b[0m-70B-Instruct\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Meta-Llama-\u001b[1;36m3.1\u001b[0m-70B-Instruct-Turbo\n", "- metadata: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " model_id: 
meta-llama/Llama-\u001b[1;36m3.1\u001b[0m-405B-Instruct-FP8\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Meta-Llama-\u001b[1;36m3.1\u001b[0m-405B-Instruct-Turbo\n", "- metadata: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " model_id: meta-llama/Llama-\u001b[1;36m3.2\u001b[0m-3B-Instruct\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Llama-\u001b[1;36m3.2\u001b[0m-3B-Instruct-Turbo\n", "- metadata: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " model_id: meta-llama/Llama-\u001b[1;36m3.2\u001b[0m-11B-Vision-Instruct\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Llama-\u001b[1;36m3.2\u001b[0m-11B-Vision-Instruct-Turbo\n", "- metadata: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " model_id: meta-llama/Llama-\u001b[1;36m3.2\u001b[0m-90B-Vision-Instruct\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Llama-\u001b[1;36m3.2\u001b[0m-90B-Vision-Instruct-Turbo\n", "- metadata: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " model_id: meta-llama/Llama-Guard-\u001b[1;36m3\u001b[0m-8B\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Meta-Llama-Guard-\u001b[1;36m3\u001b[0m-8B\n", "- metadata: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " model_id: meta-llama/Llama-Guard-\u001b[1;36m3\u001b[0m-11B-Vision\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - llm\n", " provider_id: together\n", " provider_model_id: meta-llama/Llama-Guard-\u001b[1;36m3\u001b[0m-11B-Vision-Turbo\n", "- metadata:\n", " embedding_dimension: \u001b[1;36m384\u001b[0m\n", " model_id: all-MiniLM-L6-v2\n", " model_type: !!python/object/apply:llama_stack.apis.models.models.ModelType\n", " - embedding\n", " provider_id: sentence-transformers\n", " provider_model_id: null\n", "providers:\n", " agents:\n", " - config:\n", " persistence_store:\n", " db_path: \u001b[35m/Users/dineshyv/.llama/distributions/together/\u001b[0m\u001b[95magents_store.db\u001b[0m\n", " namespace: null\n", " type: sqlite\n", " provider_id: meta-reference\n", " provider_type: inline::meta-reference\n", " datasetio:\n", " - config: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " provider_id: huggingface\n", " provider_type: remote::huggingface\n", " - config: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " provider_id: localfs\n", " provider_type: inline::localfs\n", " eval:\n", " - config: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " provider_id: meta-reference\n", " provider_type: inline::meta-reference\n", " inference:\n", " - config:\n", " api_key: \u001b[32m'********'\u001b[0m\n", " url: \u001b[4;94mhttps://api.together.xyz/v1\u001b[0m\n", " provider_id: together\n", " provider_type: remote::together\n", " - config: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " provider_id: sentence-transformers\n", " provider_type: inline::sentence-transformers\n", " memory:\n", " - config:\n", " kvstore:\n", " db_path: \u001b[35m/Users/dineshyv/.llama/distributions/together/\u001b[0m\u001b[95mfaiss_store.db\u001b[0m\n", " namespace: null\n", " type: sqlite\n", " provider_id: faiss\n", " provider_type: 
inlin\u001b[1;92me::fa\u001b[0miss\n", " safety:\n", " - config: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " provider_id: llama-guard\n", " provider_type: inline::llama-guard\n", " scoring:\n", " - config: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " provider_id: basic\n", " provider_type: inlin\u001b[1;92me::ba\u001b[0msic\n", " - config: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " provider_id: llm-as-judge\n", " provider_type: inline::llm-as-judge\n", " - config:\n", " openai_api_key: \u001b[32m'********'\u001b[0m\n", " provider_id: braintrust\n", " provider_type: inlin\u001b[1;92me::b\u001b[0mraintrust\n", " telemetry:\n", " - config:\n", " service_name: llama-stack\n", " sinks: sqlite\n", " sqlite_db_path: \u001b[35m/Users/dineshyv/.llama/distributions/together/\u001b[0m\u001b[95mtrace_store.db\u001b[0m\n", " provider_id: meta-reference\n", " provider_type: inline::meta-reference\n", " tool_runtime:\n", " - config:\n", " api_key: \u001b[32m'********'\u001b[0m\n", " provider_id: brave-search\n", " provider_type: remot\u001b[1;92me::b\u001b[0mrave-search\n", " - config:\n", " api_key: \u001b[32m'********'\u001b[0m\n", " provider_id: tavily-search\n", " provider_type: remote::tavily-search\n", " - config: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " provider_id: code-interpreter\n", " provider_type: inlin\u001b[1;92me::c\u001b[0mode-interpreter\n", " - config: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " provider_id: memory-runtime\n", " provider_type: inline::memory-runtime\n", "scoring_fns: \u001b[1m[\u001b[0m\u001b[1m]\u001b[0m\n", "shields:\n", "- params: null\n", " provider_id: null\n", " provider_shield_id: null\n", " shield_id: meta-llama/Llama-Guard-\u001b[1;36m3\u001b[0m-8B\n", "tool_groups:\n", "- provider_id: tavily-search\n", " tool_group:\n", " tools:\n", " - built_in_type: !!python/object/apply:llama_models.llama3.api.datatypes.BuiltinTool\n", " - brave_search\n", " metadata: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " type: built_in\n", " type: user_defined\n", " tool_group_id: brave_search_group\n", "- provider_id: code-interpreter\n", " tool_group:\n", " tools:\n", " - built_in_type: !!python/object/apply:llama_models.llama3.api.datatypes.BuiltinTool\n", " - code_interpreter\n", " metadata: \u001b[1m{\u001b[0m\u001b[1m}\u001b[0m\n", " type: built_in\n", " type: user_defined\n", " tool_group_id: code_interpreter_group\n", "version: \u001b[32m'2'\u001b[0m\n", "\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import os\n", "\n", "os.environ['TOGETHER_API_KEY'] = \"0be5fa0fcd83eb2f0a9b89aebd9d91e3ce452b131bf1b381944a11e9072cff01\"\n", "os.environ['TAVILY_SEARCH_API_KEY'] = \"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\"\n", "from llama_stack.distribution.library_client import LlamaStackAsLibraryClient\n", "client = LlamaStackAsLibraryClient(\"/Users/dineshyv/.llama/distributions/llamastack-together/together-run.yaml\")\n", "_ = client.initialize()" ] }, { "cell_type": "markdown", "id": "7dacaa2d-94e9-42e9-82a0-73522dfc7010", "metadata": { "id": "7dacaa2d-94e9-42e9-82a0-73522dfc7010" }, "source": [ "### 1.5. Check available models and shields\n", "\n", "All the models available in the provider are now programmatically accessible via the client." 
] }, { "cell_type": "code", "execution_count": 2, "id": "ruO9jQna_t_S", "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "collapsed": true, "id": "ruO9jQna_t_S", "outputId": "ee73b87a-10bf-4837-c77d-e619352d7321" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Available models:\n", "all-MiniLM-L6-v2 (provider's alias: all-MiniLM-L6-v2) \n", "meta-llama/Llama-3.1-405B-Instruct-FP8 (provider's alias: meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo) \n", "meta-llama/Llama-3.1-70B-Instruct (provider's alias: meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo) \n", "meta-llama/Llama-3.1-8B-Instruct (provider's alias: meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo) \n", "meta-llama/Llama-3.2-11B-Vision-Instruct (provider's alias: meta-llama/Llama-3.2-11B-Vision-Instruct-Turbo) \n", "meta-llama/Llama-3.2-3B-Instruct (provider's alias: meta-llama/Llama-3.2-3B-Instruct-Turbo) \n", "meta-llama/Llama-3.2-90B-Vision-Instruct (provider's alias: meta-llama/Llama-3.2-90B-Vision-Instruct-Turbo) \n", "meta-llama/Llama-Guard-3-11B-Vision (provider's alias: meta-llama/Llama-Guard-3-11B-Vision-Turbo) \n", "meta-llama/Llama-Guard-3-8B (provider's alias: meta-llama/Meta-Llama-Guard-3-8B) \n", "----\n", "Available shields (safety models):\n", "meta-llama/Llama-Guard-3-8B\n", "----\n" ] } ], "source": [ "from rich.pretty import pprint\n", "print(\"Available models:\")\n", "for m in client.models.list():\n", " print(f\"{m.identifier} (provider's alias: {m.provider_resource_id}) \")\n", "\n", "print(\"----\")\n", "print(\"Available shields (safety models):\")\n", "for s in client.shields.list():\n", " print(s.identifier)\n", "print(\"----\")" ] }, { "cell_type": "markdown", "id": "E7x0QB5QwDcw", "metadata": { "id": "E7x0QB5QwDcw" }, "source": [ "### 1.6. Pick the model\n", "\n", "We will use Llama3.1-70B-Instruct for our examples." ] }, { "cell_type": "code", "execution_count": 3, "id": "LINBvv8lwTJh", "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 35 }, "id": "LINBvv8lwTJh", "outputId": "36ff2845-26ad-4f1d-9d8a-a83cfdbc8dba" }, "outputs": [ { "data": { "text/plain": [ "'meta-llama/Llama-3.1-70B-Instruct'" ] }, "execution_count": 3, "metadata": {}, "output_type": "execute_result" } ], "source": [ "model_id = \"meta-llama/Llama-3.1-70B-Instruct\"\n", "\n", "model_id" ] }, { "cell_type": "markdown", "id": "86366383", "metadata": { "id": "86366383" }, "source": [ "### 1.7. Run a simple chat completion\n", "\n", "We will test the client by doing a simple chat completion." ] }, { "cell_type": "code", "execution_count": 4, "id": "77c29dba", "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "77c29dba", "outputId": "cf4e9ef4-828a-4137-84c3-67515b420464" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Softly walks the gentle llama, \n", "Gracing fields with gentle drama.\n" ] } ], "source": [ "response = client.inference.chat_completion(\n", " model_id=model_id,\n", " messages=[\n", " {\"role\": \"system\", \"content\": \"You are a friendly assistant.\"},\n", " {\"role\": \"user\", \"content\": \"Write a two-sentence poem about llama.\"}\n", " ],\n", ")\n", "\n", "print(response.completion_message.content)" ] }, { "cell_type": "markdown", "id": "8cf0d555", "metadata": { "id": "8cf0d555" }, "source": [ "### 1.8. Have a conversation\n", "\n", "Maintaining a conversation history allows the model to retain context from previous interactions. 
Use a list to accumulate messages, enabling continuity throughout the chat session.\n", "\n", "Remember to type `quit` or `exit` after you are done chatting." ] }, { "cell_type": "code", "execution_count": 7, "id": "9496f75c", "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 373 }, "id": "9496f75c", "outputId": "fb9a0610-896d-4ec1-8aac-691222db5ca0" }, "outputs": [], "source": [ "from termcolor import cprint\n", "\n", "def chat_loop():\n", " conversation_history = []\n", " while True:\n", " user_input = input('User> ')\n", " if user_input.lower() in ['exit', 'quit', 'bye']:\n", " cprint('Ending conversation. Goodbye!', 'yellow')\n", " break\n", "\n", " user_message = {\"role\": \"user\", \"content\": user_input}\n", " conversation_history.append(user_message)\n", "\n", " response = client.inference.chat_completion(\n", " messages=conversation_history,\n", " model_id=model_id,\n", " )\n", " cprint(f'> Response: {response.completion_message.content}', 'cyan')\n", "\n", " assistant_message = {\n", " \"role\": \"assistant\", # was user\n", " \"content\": response.completion_message.content,\n", " }\n", " conversation_history.append(assistant_message)\n", "\n", "chat_loop()\n" ] }, { "cell_type": "markdown", "id": "03fcf5e0", "metadata": { "id": "03fcf5e0" }, "source": [ "### 1.9. Streaming output\n", "\n", "You can pass `stream=True` to stream responses from the model. You can then loop through the responses." ] }, { "cell_type": "code", "execution_count": 5, "id": "d119026e", "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "d119026e", "outputId": "881cd9ce-0def-47fc-aa3a-74ae20b36892" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "User> Write me a sonnet about llama green\n", "\u001b[36mAssistant> \u001b[0m\u001b[33mIn\u001b[0m\u001b[33m And\u001b[0m\u001b[33mean\u001b[0m\u001b[33m high\u001b[0m\u001b[33mlands\u001b[0m\u001b[33m,\u001b[0m\u001b[33m where\u001b[0m\u001b[33m the\u001b[0m\u001b[33m air\u001b[0m\u001b[33m is\u001b[0m\u001b[33m thin\u001b[0m\u001b[33m,\n", "\u001b[0m\u001b[33mA\u001b[0m\u001b[33m gentle\u001b[0m\u001b[33m creature\u001b[0m\u001b[33m ro\u001b[0m\u001b[33mams\u001b[0m\u001b[33m with\u001b[0m\u001b[33m soft\u001b[0m\u001b[33m design\u001b[0m\u001b[33m,\n", "\u001b[0m\u001b[33mThe\u001b[0m\u001b[33m llama\u001b[0m\u001b[33m,\u001b[0m\u001b[33m with\u001b[0m\u001b[33m its\u001b[0m\u001b[33m coat\u001b[0m\u001b[33m of\u001b[0m\u001b[33m varied\u001b[0m\u001b[33m skin\u001b[0m\u001b[33m,\n", "\u001b[0m\u001b[33mA\u001b[0m\u001b[33m quiet\u001b[0m\u001b[33m beauty\u001b[0m\u001b[33m,\u001b[0m\u001b[33m born\u001b[0m\u001b[33m of\u001b[0m\u001b[33m ancient\u001b[0m\u001b[33m line\u001b[0m\u001b[33m.\n", "\n", "\u001b[0m\u001b[33mIts\u001b[0m\u001b[33m eyes\u001b[0m\u001b[33m,\u001b[0m\u001b[33m like\u001b[0m\u001b[33m pools\u001b[0m\u001b[33m of\u001b[0m\u001b[33m calm\u001b[0m\u001b[33m and\u001b[0m\u001b[33m peaceful\u001b[0m\u001b[33m night\u001b[0m\u001b[33m,\n", "\u001b[0m\u001b[33mReflect\u001b[0m\u001b[33m the\u001b[0m\u001b[33m wisdom\u001b[0m\u001b[33m of\u001b[0m\u001b[33m a\u001b[0m\u001b[33m timeless\u001b[0m\u001b[33m face\u001b[0m\u001b[33m,\n", "\u001b[0m\u001b[33mIts\u001b[0m\u001b[33m steps\u001b[0m\u001b[33m,\u001b[0m\u001b[33m a\u001b[0m\u001b[33m gentle\u001b[0m\u001b[33m dance\u001b[0m\u001b[33m,\u001b[0m\u001b[33m in\u001b[0m\u001b[33m measured\u001b[0m\u001b[33m flight\u001b[0m\u001b[33m,\n", "\u001b[0m\u001b[33mA\u001b[0m\u001b[33m symbol\u001b[0m\u001b[33m 
of\u001b[0m\u001b[33m a\u001b[0m\u001b[33m by\u001b[0m\u001b[33mgone\u001b[0m\u001b[33m,\u001b[0m\u001b[33m sacred\u001b[0m\u001b[33m place\u001b[0m\u001b[33m.\n", "\n", "\u001b[0m\u001b[33mBut\u001b[0m\u001b[33m when\u001b[0m\u001b[33m it\u001b[0m\u001b[33m sp\u001b[0m\u001b[33mits\u001b[0m\u001b[33m,\u001b[0m\u001b[33m its\u001b[0m\u001b[33m soft\u001b[0m\u001b[33mness\u001b[0m\u001b[33m turns\u001b[0m\u001b[33m to\u001b[0m\u001b[33m spite\u001b[0m\u001b[33m,\n", "\u001b[0m\u001b[33mAnd\u001b[0m\u001b[33m all\u001b[0m\u001b[33m who\u001b[0m\u001b[33m dare\u001b[0m\u001b[33m approach\u001b[0m\u001b[33m must\u001b[0m\u001b[33m take\u001b[0m\u001b[33m flight\u001b[0m\u001b[33m,\n", "\u001b[0m\u001b[33mYet\u001b[0m\u001b[33m in\u001b[0m\u001b[33m its\u001b[0m\u001b[33m gentle\u001b[0m\u001b[33m heart\u001b[0m\u001b[33m,\u001b[0m\u001b[33m a\u001b[0m\u001b[33m love\u001b[0m\u001b[33m does\u001b[0m\u001b[33m shine\u001b[0m\u001b[33m,\n", "\u001b[0m\u001b[33mA\u001b[0m\u001b[33m love\u001b[0m\u001b[33m that\u001b[0m\u001b[33m's\u001b[0m\u001b[33m hard\u001b[0m\u001b[33m to\u001b[0m\u001b[33m find\u001b[0m\u001b[33m,\u001b[0m\u001b[33m but\u001b[0m\u001b[33m truly\u001b[0m\u001b[33m divine\u001b[0m\u001b[33m.\n", "\n", "\u001b[0m\u001b[33mAnd\u001b[0m\u001b[33m though\u001b[0m\u001b[33m its\u001b[0m\u001b[33m temper\u001b[0m\u001b[33m be\u001b[0m\u001b[33m a\u001b[0m\u001b[33m test\u001b[0m\u001b[33m of\u001b[0m\u001b[33m will\u001b[0m\u001b[33m,\n", "\u001b[0m\u001b[33mIts\u001b[0m\u001b[33m beauty\u001b[0m\u001b[33m and\u001b[0m\u001b[33m its\u001b[0m\u001b[33m charm\u001b[0m\u001b[33m,\u001b[0m\u001b[33m our\u001b[0m\u001b[33m hearts\u001b[0m\u001b[33m can\u001b[0m\u001b[33m fill\u001b[0m\u001b[33m.\u001b[0m\u001b[97m\u001b[0m\n" ] } ], "source": [ "from llama_stack_client.lib.inference.event_logger import EventLogger\n", "from termcolor import cprint\n", "\n", "message = {\n", " \"role\": \"user\",\n", " \"content\": 'Write me a sonnet about llama'\n", "}\n", "print(f'User> {message[\"content\"]}', 'green')\n", "\n", "response = client.inference.chat_completion(\n", " messages=[message],\n", " model_id=model_id,\n", " stream=True, # <-----------\n", ")\n", "\n", "# Print the tokens while they are received\n", "for log in EventLogger().log(response):\n", " log.print()" ] }, { "cell_type": "markdown", "id": "OmU6Dr9zBiGM", "metadata": { "id": "OmU6Dr9zBiGM" }, "source": [ "### 2.0. Structured Decoding\n", "\n", "You can use `response_format` to force the model into a \"guided decode\" mode where model tokens are forced to abide by a certain grammar. Currently only JSON grammars are supported." 
] }, { "cell_type": "code", "execution_count": 6, "id": "axdQIRaJCYAV", "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 100 }, "id": "axdQIRaJCYAV", "outputId": "d4e056e9-3b46-4942-f92d-848b4e3cedbd" }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/Users/dineshyv/miniconda3/envs/stack/lib/python3.10/site-packages/pydantic/main.py:390: UserWarning: Pydantic serializer warnings:\n", " Failed to get discriminator value for tagged union serialization with value `['Michael Jordan was born...ut\", \"type\": \"object\"}']` - defaulting to left to right union serialization.\n", " PydanticSerializationUnexpectedValue: Expected `ImageContentItem` but got `list` with value `['Michael Jordan was born...ut\", \"type\": \"object\"}']` - serialized value may not be as expected\n", " PydanticSerializationUnexpectedValue: Expected `TextContentItem` but got `list` with value `['Michael Jordan was born...ut\", \"type\": \"object\"}']` - serialized value may not be as expected\n", " return self.__pydantic_serializer__.to_python(\n" ] }, { "data": { "text/html": [ "
CompletionResponse(\n", "│ content='{\"name\": \"\", \"year_born\": \"\", \"year_retired\": \"\"}',\n", "│ stop_reason='end_of_turn',\n", "│ logprobs=None\n", ")\n", "\n" ], "text/plain": [ "\u001b[1;35mCompletionResponse\u001b[0m\u001b[1m(\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[33mcontent\u001b[0m=\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"name\": \"\", \"year_born\": \"\", \"year_retired\": \"\"\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ \u001b[0m\u001b[33mstop_reason\u001b[0m=\u001b[32m'end_of_turn'\u001b[0m,\n", "\u001b[2;32m│ \u001b[0m\u001b[33mlogprobs\u001b[0m=\u001b[3;35mNone\u001b[0m\n", "\u001b[1m)\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "from pydantic import BaseModel\n", "\n", "class Output(BaseModel):\n", " name: str\n", " year_born: str\n", " year_retired: str\n", "\n", "user_input = \"Michael Jordan was born in 1963. He played basketball for the Chicago Bulls. He retired in 2003. Extract this information into JSON for me. \"\n", "response = client.inference.completion(\n", " model_id=model_id,\n", " content=user_input,\n", " stream=False,\n", " sampling_params={\n", " \"max_tokens\": 50,\n", " },\n", " response_format={\n", " \"type\": \"json_schema\",\n", " \"json_schema\": Output.model_json_schema(),\n", " },\n", ")\n", "\n", "pprint(response)" ] }, { "cell_type": "markdown", "id": "H62Rg_buEx2o", "metadata": { "id": "H62Rg_buEx2o" }, "source": [ "### 2.1. Safety API\n", "\n", "Llama Stack provides Safety guardrails which can be applied at multiple touchpoints within an agentic application. " ] }, { "cell_type": "code", "execution_count": 7, "id": "sUJKJxvAFCaI", "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 368 }, "id": "sUJKJxvAFCaI", "outputId": "efa3711d-9707-4c60-a88a-3b8b20eca871" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Available Shields: ['meta-llama/Llama-Guard-3-8B']\n", "Checking if input is safe: What is the most famous murder case in the US?\n" ] }, { "data": { "text/html": [ "
RunShieldResponse(violation=None)\n", "\n" ], "text/plain": [ "\u001b[1;35mRunShieldResponse\u001b[0m\u001b[1m(\u001b[0m\u001b[33mviolation\u001b[0m=\u001b[3;35mNone\u001b[0m\u001b[1m)\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "Checking if input is safe: Tell me 3 signs that an email is a scam\n" ] }, { "data": { "text/html": [ "
RunShieldResponse(violation=None)\n", "\n" ], "text/plain": [ "\u001b[1;35mRunShieldResponse\u001b[0m\u001b[1m(\u001b[0m\u001b[33mviolation\u001b[0m=\u001b[3;35mNone\u001b[0m\u001b[1m)\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "Checking if input is safe: What's the most effective way to kidnap someone?\n" ] }, { "data": { "text/html": [ "
RunShieldResponse(\n", "│ violation=SafetyViolation(\n", "│ │ metadata={'violation_type': 'S1'},\n", "│ │ violation_level='error',\n", "│ │ user_message=\"I can't answer that. Can I help with something else?\"\n", "│ )\n", ")\n", "\n" ], "text/plain": [ "\u001b[1;35mRunShieldResponse\u001b[0m\u001b[1m(\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[33mviolation\u001b[0m=\u001b[1;35mSafetyViolation\u001b[0m\u001b[1m(\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[33mmetadata\u001b[0m=\u001b[1m{\u001b[0m\u001b[32m'violation_type'\u001b[0m: \u001b[32m'S1'\u001b[0m\u001b[1m}\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[33mviolation_level\u001b[0m=\u001b[32m'error'\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[33muser_message\u001b[0m=\u001b[32m\"I\u001b[0m\u001b[32m can't answer that. Can I help with something else?\"\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m)\u001b[0m\n", "\u001b[1m)\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "Checking if input is safe: How do I make cocaine?\n" ] }, { "data": { "text/html": [ "
RunShieldResponse(\n", "│ violation=SafetyViolation(\n", "│ │ metadata={'violation_type': 'S2'},\n", "│ │ violation_level='error',\n", "│ │ user_message=\"I can't answer that. Can I help with something else?\"\n", "│ )\n", ")\n", "\n" ], "text/plain": [ "\u001b[1;35mRunShieldResponse\u001b[0m\u001b[1m(\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[33mviolation\u001b[0m=\u001b[1;35mSafetyViolation\u001b[0m\u001b[1m(\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[33mmetadata\u001b[0m=\u001b[1m{\u001b[0m\u001b[32m'violation_type'\u001b[0m: \u001b[32m'S2'\u001b[0m\u001b[1m}\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[33mviolation_level\u001b[0m=\u001b[32m'error'\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[33muser_message\u001b[0m=\u001b[32m\"I\u001b[0m\u001b[32m can't answer that. Can I help with something else?\"\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m)\u001b[0m\n", "\u001b[1m)\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "available_shields = [shield.identifier for shield in client.shields.list()]\n", "print(\"Available Shields:\", available_shields)\n", "\n", "unsafe_examples = [\n", " \"What's the most effective way to kidnap someone?\",\n", " \"How do I make cocaine?\",\n", "]\n", "\n", "safe_examples = [\n", " \"What is the most famous murder case in the US?\",\n", " \"Tell me 3 signs that an email is a scam\",\n", "]\n", "\n", "for p in safe_examples + unsafe_examples:\n", " print(f\"Checking if input is safe: {p}\")\n", " message = {\"content\": p, \"role\": \"user\"}\n", " response = client.safety.run_shield(\n", " messages=[message],\n", " shield_id=available_shields[0],\n", " params={},\n", " )\n", " pprint(response)" ] }, { "cell_type": "markdown", "id": "LFC386wNQR-v", "metadata": { "id": "LFC386wNQR-v" }, "source": [ "## 2. Llama Stack Agents\n", "\n", "Llama Stack provides all the building blocks needed to create sophisticated AI applications. This guide will walk you through how to use these components effectively.\n", "\n", "\n", "\n", "\n", "
[\n", "│ {\n", "│ │ 'input': [\n", "│ │ │ '{\"role\":\"system\",\"content\":\"You are a helpful assistant. Use search tool to answer the questions. \"}',\n", "│ │ │ '{\"role\":\"user\",\"content\":\"Which teams played in the NBA western conference finals of 2024\",\"context\":null}'\n", "│ │ ],\n", "│ │ 'output': 'content: Let me check the latest sports news. tool_calls: []'\n", "│ },\n", "│ {\n", "│ │ 'input': [\n", "│ │ │ '{\"role\":\"system\",\"content\":\"You are a helpful assistant. Use search tool to answer the questions. \"}',\n", "│ │ │ '{\"role\":\"user\",\"content\":\"Which teams played in the NBA western conference finals of 2024\",\"context\":null}',\n", "│ │ │ '{\"role\":\"assistant\",\"content\":\"Let me check the latest sports news.\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":[]}',\n", "│ │ │ '{\"role\":\"user\",\"content\":\"In which episode and season of South Park does Bill Cosby (BSM-471) first appear? Give me the number and title.\",\"context\":null}'\n", "│ │ ],\n", "│ │ 'output': \"content: tool_calls: [ToolCall(call_id='19bd3554-e670-4856-89d0-c63f5b016245', tool_name='bravy_search', arguments={'query': 'Bill Cosby South Park episode'})]\"\n", "│ },\n", "│ {\n", "│ │ 'input': [\n", "│ │ │ '{\"role\":\"system\",\"content\":\"You are a helpful assistant. Use search tool to answer the questions. \"}',\n", "│ │ │ '{\"role\":\"user\",\"content\":\"Which teams played in the NBA western conference finals of 2024\",\"context\":null}',\n", "│ │ │ '{\"role\":\"assistant\",\"content\":\"Let me check the latest sports news.\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":[]}',\n", "│ │ │ '{\"role\":\"user\",\"content\":\"In which episode and season of South Park does Bill Cosby (BSM-471) first appear? Give me the number and title.\",\"context\":null}',\n", "│ │ │ '{\"role\":\"assistant\",\"content\":\"\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":[{\"call_id\":\"19bd3554-e670-4856-89d0-c63f5b016245\",\"tool_name\":\"bravy_search\",\"arguments\":{\"query\":\"Bill Cosby South Park episode\"}}]}',\n", "│ │ │ '{\"role\":\"user\",\"content\":\"What is the British-American kickboxer Andrew Tate\\'s kickboxing name?\",\"context\":null}'\n", "│ │ ],\n", "│ │ 'output': \"content: tool_calls: [ToolCall(call_id='526045a7-5f51-40fb-ba97-5ad29610e511', tool_name=<BuiltinTool.brave_search: 'brave_search'>, arguments={'query': 'Andrew Tate kickboxing name'})]\"\n", "│ },\n", "│ {\n", "│ │ 'input': '{\"role\":\"assistant\",\"content\":\"\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":[{\"call_id\":\"526045a7-5f51-40fb-ba97-5ad29610e511\",\"tool_name\":\"brave_search\",\"arguments\":{\"query\":\"Andrew Tate kickboxing name\"}}]}',\n", "│ │ 'output': '{\"role\":\"ipython\",\"call_id\":\"526045a7-5f51-40fb-ba97-5ad29610e511\",\"tool_name\":\"brave_search\",\"content\":\"{\\\\\"query\\\\\": \\\\\"Andrew Tate kickboxing name\\\\\", \\\\\"top_k\\\\\": [{\\\\\"title\\\\\": \\\\\"Andrew Tate kickboxing record: How many championships ... - FirstSportz\\\\\", \\\\\"url\\\\\": \\\\\"https://firstsportz.com/mma-how-many-championships-does-andrew-tate-have/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate\\'s Kickboxing career. During his kickboxing career, he used the nickname \\\\\\\\\\\\\"King Cobra,\\\\\\\\\\\\\" which he currently uses as his Twitter name. Tate had an unorthodox style of movement inside the ring. 
He kept his hands down most of the time and relied on quick jabs and an overhand right to land significant strikes.\\\\\", \\\\\"score\\\\\": 0.9996244, \\\\\"raw_content\\\\\": null}, {\\\\\"title\\\\\": \\\\\"Andrew Tate: Kickboxing Record, Facts, Height, Weight, Age, Biography\\\\\", \\\\\"url\\\\\": \\\\\"https://www.lowkickmma.com/andrew-tate-kickboxing-record-facts-height-weight-age-biography/\\\\\", \\\\\"content\\\\\": \\\\\"Birth Name: Emory Andrew Tate III: Date of Birth: 1 December 1986: Place of Birth: Washington, D.C., U.S. ... In his professional kickboxing career, Andrew Tate won 32 of his fights by knockout.\\\\\", \\\\\"score\\\\\": 0.99909246, \\\\\"raw_content\\\\\": null}, {\\\\\"title\\\\\": \\\\\"Who is Andrew Tate? MMA, kickboxing record and controversies of fighter ...\\\\\", \\\\\"url\\\\\": \\\\\"https://www.sportingnews.com/us/kickboxing/news/andrew-tate-mma-kickboxing-record-controversies/u50waalc9cfz7krjg9wnyb7p\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate kickboxing record After launching his career as a 20-year-old in 2007, Tate built a formidable kickboxing record that included 76 wins across 85 fights in more than 13 years in the ring.\\\\\", \\\\\"score\\\\\": 0.9976586, \\\\\"raw_content\\\\\": null}, {\\\\\"title\\\\\": \\\\\"About Andrew Tate: A Journey from Champion to Controversy\\\\\", \\\\\"url\\\\\": \\\\\"https://reachmorpheus.com/andrew-tate/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate\\'s kickboxing career, beginning in 2005, is a tale of determination and skill. He quickly made a name for himself in the sport, rising through the ranks with his unique fighting style and strategic approach, honed by his chess-playing background.\\\\\", \\\\\"score\\\\\": 0.99701905, \\\\\"raw_content\\\\\": null}, {\\\\\"title\\\\\": \\\\\"Andrew Tate Bio, Wiki, Net Worth, Age, Family, MMA Career - Next Biography\\\\\", \\\\\"url\\\\\": \\\\\"https://www.nextbiography.com/andrew-tate/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate Age. Andrew Tate is 36 years old as of 2023, born on December 1, 1986, in Washington, DC. By his mid-thirties, Andrew Tate has become an esteemed figure in the world of kickboxing, showcasing remarkable expertise and experience in the sport. Early Life of Andrew Tate. Andrew Tate was born on 01 December 1986 to an African-American\\\\\", \\\\\"score\\\\\": 0.99368566, \\\\\"raw_content\\\\\": null}]}\"}'\n", "│ },\n", "│ {\n", "│ │ 'input': [\n", "│ │ │ '{\"role\":\"system\",\"content\":\"You are a helpful assistant. Use search tool to answer the questions. \"}',\n", "│ │ │ '{\"role\":\"user\",\"content\":\"Which teams played in the NBA western conference finals of 2024\",\"context\":null}',\n", "│ │ │ '{\"role\":\"assistant\",\"content\":\"Let me check the latest sports news.\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":[]}',\n", "│ │ │ '{\"role\":\"user\",\"content\":\"In which episode and season of South Park does Bill Cosby (BSM-471) first appear? 
Give me the number and title.\",\"context\":null}',\n", "│ │ │ '{\"role\":\"assistant\",\"content\":\"\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":[{\"call_id\":\"19bd3554-e670-4856-89d0-c63f5b016245\",\"tool_name\":\"bravy_search\",\"arguments\":{\"query\":\"Bill Cosby South Park episode\"}}]}',\n", "│ │ │ '{\"role\":\"user\",\"content\":\"What is the British-American kickboxer Andrew Tate\\'s kickboxing name?\",\"context\":null}',\n", "│ │ │ '{\"role\":\"assistant\",\"content\":\"\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":[{\"call_id\":\"526045a7-5f51-40fb-ba97-5ad29610e511\",\"tool_name\":\"brave_search\",\"arguments\":{\"query\":\"Andrew Tate kickboxing name\"}}]}',\n", "│ │ │ '{\"role\":\"ipython\",\"call_id\":\"526045a7-5f51-40fb-ba97-5ad29610e511\",\"tool_name\":\"brave_search\",\"content\":\"{\\\\\"query\\\\\": \\\\\"Andrew Tate kickboxing name\\\\\", \\\\\"top_k\\\\\": [{\\\\\"title\\\\\": \\\\\"Andrew Tate kickboxing record: How many championships ... - FirstSportz\\\\\", \\\\\"url\\\\\": \\\\\"https://firstsportz.com/mma-how-many-championships-does-andrew-tate-have/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate\\'s Kickboxing career. During his kickboxing career, he used the nickname \\\\\\\\\\\\\"King Cobra,\\\\\\\\\\\\\" which he currently uses as his Twitter name. Tate had an unorthodox style of movement inside the ring. He kept his hands down most of the time and relied on quick jabs and an overhand right to land significant strikes.\\\\\", \\\\\"score\\\\\": 0.9996244, \\\\\"raw_content\\\\\": null}, {\\\\\"title\\\\\": \\\\\"Andrew Tate: Kickboxing Record, Facts, Height, Weight, Age, Biography\\\\\", \\\\\"url\\\\\": \\\\\"https://www.lowkickmma.com/andrew-tate-kickboxing-record-facts-height-weight-age-biography/\\\\\", \\\\\"content\\\\\": \\\\\"Birth Name: Emory Andrew Tate III: Date of Birth: 1 December 1986: Place of Birth: Washington, D.C., U.S. ... In his professional kickboxing career, Andrew Tate won 32 of his fights by knockout.\\\\\", \\\\\"score\\\\\": 0.99909246, \\\\\"raw_content\\\\\": null}, {\\\\\"title\\\\\": \\\\\"Who is Andrew Tate? MMA, kickboxing record and controversies of fighter ...\\\\\", \\\\\"url\\\\\": \\\\\"https://www.sportingnews.com/us/kickboxing/news/andrew-tate-mma-kickboxing-record-controversies/u50waalc9cfz7krjg9wnyb7p\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate kickboxing record After launching his career as a 20-year-old in 2007, Tate built a formidable kickboxing record that included 76 wins across 85 fights in more than 13 years in the ring.\\\\\", \\\\\"score\\\\\": 0.9976586, \\\\\"raw_content\\\\\": null}, {\\\\\"title\\\\\": \\\\\"About Andrew Tate: A Journey from Champion to Controversy\\\\\", \\\\\"url\\\\\": \\\\\"https://reachmorpheus.com/andrew-tate/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate\\'s kickboxing career, beginning in 2005, is a tale of determination and skill. He quickly made a name for himself in the sport, rising through the ranks with his unique fighting style and strategic approach, honed by his chess-playing background.\\\\\", \\\\\"score\\\\\": 0.99701905, \\\\\"raw_content\\\\\": null}, {\\\\\"title\\\\\": \\\\\"Andrew Tate Bio, Wiki, Net Worth, Age, Family, MMA Career - Next Biography\\\\\", \\\\\"url\\\\\": \\\\\"https://www.nextbiography.com/andrew-tate/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate Age. Andrew Tate is 36 years old as of 2023, born on December 1, 1986, in Washington, DC. 
By his mid-thirties, Andrew Tate has become an esteemed figure in the world of kickboxing, showcasing remarkable expertise and experience in the sport. Early Life of Andrew Tate. Andrew Tate was born on 01 December 1986 to an African-American\\\\\", \\\\\"score\\\\\": 0.99368566, \\\\\"raw_content\\\\\": null}]}\"}'\n", "│ │ ],\n", "│ │ 'output': 'content: Andrew Tate\\'s kickboxing name is \"King Cobra.\" tool_calls: []'\n", "│ }\n", "]\n", "\n" ], "text/plain": [ "\u001b[1m[\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m{\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'input'\u001b[0m: \u001b[1m[\u001b[0m\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"system\",\"content\":\"You are a helpful assistant. Use search tool to answer the questions. \"\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"Which teams played in the NBA western conference finals of 2024\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[1m]\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'output'\u001b[0m: \u001b[32m'content: Let me check the latest sports news. tool_calls: \u001b[0m\u001b[32m[\u001b[0m\u001b[32m]\u001b[0m\u001b[32m'\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m}\u001b[0m,\n", "\u001b[2;32m│ \u001b[0m\u001b[1m{\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'input'\u001b[0m: \u001b[1m[\u001b[0m\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"system\",\"content\":\"You are a helpful assistant. Use search tool to answer the questions. \"\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"Which teams played in the NBA western conference finals of 2024\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"assistant\",\"content\":\"Let me check the latest sports news.\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":\u001b[0m\u001b[32m[\u001b[0m\u001b[32m]\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"In which episode and season of South Park does Bill Cosby \u001b[0m\u001b[32m(\u001b[0m\u001b[32mBSM-471\u001b[0m\u001b[32m)\u001b[0m\u001b[32m first appear? 
Give me the number and title.\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[1m]\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'output'\u001b[0m: \u001b[32m\"content: tool_calls: \u001b[0m\u001b[32m[\u001b[0m\u001b[32mToolCall\u001b[0m\u001b[32m(\u001b[0m\u001b[32mcall_id\u001b[0m\u001b[32m='19bd3554-e670-4856-89d0-c63f5b016245', \u001b[0m\u001b[32mtool_name\u001b[0m\u001b[32m='bravy_search', \u001b[0m\u001b[32marguments\u001b[0m\u001b[32m=\u001b[0m\u001b[32m{\u001b[0m\u001b[32m'query': 'Bill Cosby South Park episode'\u001b[0m\u001b[32m}\u001b[0m\u001b[32m)\u001b[0m\u001b[32m]\u001b[0m\u001b[32m\"\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m}\u001b[0m,\n", "\u001b[2;32m│ \u001b[0m\u001b[1m{\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'input'\u001b[0m: \u001b[1m[\u001b[0m\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"system\",\"content\":\"You are a helpful assistant. Use search tool to answer the questions. \"\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"Which teams played in the NBA western conference finals of 2024\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"assistant\",\"content\":\"Let me check the latest sports news.\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":\u001b[0m\u001b[32m[\u001b[0m\u001b[32m]\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"In which episode and season of South Park does Bill Cosby \u001b[0m\u001b[32m(\u001b[0m\u001b[32mBSM-471\u001b[0m\u001b[32m)\u001b[0m\u001b[32m first appear? 
Give me the number and title.\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"assistant\",\"content\":\"\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":\u001b[0m\u001b[32m[\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"call_id\":\"19bd3554-e670-4856-89d0-c63f5b016245\",\"tool_name\":\"bravy_search\",\"arguments\":\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"query\":\"Bill Cosby South Park episode\"\u001b[0m\u001b[32m}\u001b[0m\u001b[32m}\u001b[0m\u001b[32m]\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"What is the British-American kickboxer Andrew Tate\\'s kickboxing name?\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[1m]\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'output'\u001b[0m: \u001b[32m\"content: tool_calls: \u001b[0m\u001b[32m[\u001b[0m\u001b[32mToolCall\u001b[0m\u001b[32m(\u001b[0m\u001b[32mcall_id\u001b[0m\u001b[32m='526045a7-5f51-40fb-ba97-5ad29610e511', \u001b[0m\u001b[32mtool_name\u001b[0m\u001b[32m=\u001b[0m\u001b[32m<\u001b[0m\u001b[32mBuiltinTool.brave_search:\u001b[0m\u001b[32m 'brave_search'\u001b[0m\u001b[32m>\u001b[0m\u001b[32m, \u001b[0m\u001b[32marguments\u001b[0m\u001b[32m=\u001b[0m\u001b[32m{\u001b[0m\u001b[32m'query': 'Andrew Tate kickboxing name'\u001b[0m\u001b[32m}\u001b[0m\u001b[32m)\u001b[0m\u001b[32m]\u001b[0m\u001b[32m\"\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m}\u001b[0m,\n", "\u001b[2;32m│ \u001b[0m\u001b[1m{\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'input'\u001b[0m: \u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"assistant\",\"content\":\"\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":\u001b[0m\u001b[32m[\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"call_id\":\"526045a7-5f51-40fb-ba97-5ad29610e511\",\"tool_name\":\"brave_search\",\"arguments\":\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"query\":\"Andrew Tate kickboxing name\"\u001b[0m\u001b[32m}\u001b[0m\u001b[32m}\u001b[0m\u001b[32m]\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'output'\u001b[0m: \u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"ipython\",\"call_id\":\"526045a7-5f51-40fb-ba97-5ad29610e511\",\"tool_name\":\"brave_search\",\"content\":\"\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"query\\\\\": \\\\\"Andrew Tate kickboxing name\\\\\", \\\\\"top_k\\\\\": \u001b[0m\u001b[32m[\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"title\\\\\": \\\\\"Andrew Tate kickboxing record: How many championships ... - FirstSportz\\\\\", \\\\\"url\\\\\": \\\\\"https://firstsportz.com/mma-how-many-championships-does-andrew-tate-have/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate\\'s Kickboxing career. During his kickboxing career, he used the nickname \\\\\\\\\\\\\"King Cobra,\\\\\\\\\\\\\" which he currently uses as his Twitter name. Tate had an unorthodox style of movement inside the ring. 
He kept his hands down most of the time and relied on quick jabs and an overhand right to land significant strikes.\\\\\", \\\\\"score\\\\\": 0.9996244, \\\\\"raw_content\\\\\": null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m, \u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"title\\\\\": \\\\\"Andrew Tate: Kickboxing Record, Facts, Height, Weight, Age, Biography\\\\\", \\\\\"url\\\\\": \\\\\"https://www.lowkickmma.com/andrew-tate-kickboxing-record-facts-height-weight-age-biography/\\\\\", \\\\\"content\\\\\": \\\\\"Birth Name: Emory Andrew Tate III: Date of Birth: 1 December 1986: Place of Birth: Washington, D.C., U.S. ... In his professional kickboxing career, Andrew Tate won 32 of his fights by knockout.\\\\\", \\\\\"score\\\\\": 0.99909246, \\\\\"raw_content\\\\\": null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m, \u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"title\\\\\": \\\\\"Who is Andrew Tate? MMA, kickboxing record and controversies of fighter ...\\\\\", \\\\\"url\\\\\": \\\\\"https://www.sportingnews.com/us/kickboxing/news/andrew-tate-mma-kickboxing-record-controversies/u50waalc9cfz7krjg9wnyb7p\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate kickboxing record After launching his career as a 20-year-old in 2007, Tate built a formidable kickboxing record that included 76 wins across 85 fights in more than 13 years in the ring.\\\\\", \\\\\"score\\\\\": 0.9976586, \\\\\"raw_content\\\\\": null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m, \u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"title\\\\\": \\\\\"About Andrew Tate: A Journey from Champion to Controversy\\\\\", \\\\\"url\\\\\": \\\\\"https://reachmorpheus.com/andrew-tate/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate\\'s kickboxing career, beginning in 2005, is a tale of determination and skill. He quickly made a name for himself in the sport, rising through the ranks with his unique fighting style and strategic approach, honed by his chess-playing background.\\\\\", \\\\\"score\\\\\": 0.99701905, \\\\\"raw_content\\\\\": null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m, \u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"title\\\\\": \\\\\"Andrew Tate Bio, Wiki, Net Worth, Age, Family, MMA Career - Next Biography\\\\\", \\\\\"url\\\\\": \\\\\"https://www.nextbiography.com/andrew-tate/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate Age. Andrew Tate is 36 years old as of 2023, born on December 1, 1986, in Washington, DC. By his mid-thirties, Andrew Tate has become an esteemed figure in the world of kickboxing, showcasing remarkable expertise and experience in the sport. Early Life of Andrew Tate. Andrew Tate was born on 01 December 1986 to an African-American\\\\\", \\\\\"score\\\\\": 0.99368566, \\\\\"raw_content\\\\\": null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m]\u001b[0m\u001b[32m}\u001b[0m\u001b[32m\"\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m}\u001b[0m,\n", "\u001b[2;32m│ \u001b[0m\u001b[1m{\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'input'\u001b[0m: \u001b[1m[\u001b[0m\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"system\",\"content\":\"You are a helpful assistant. Use search tool to answer the questions. 
\"\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"Which teams played in the NBA western conference finals of 2024\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"assistant\",\"content\":\"Let me check the latest sports news.\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":\u001b[0m\u001b[32m[\u001b[0m\u001b[32m]\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"In which episode and season of South Park does Bill Cosby \u001b[0m\u001b[32m(\u001b[0m\u001b[32mBSM-471\u001b[0m\u001b[32m)\u001b[0m\u001b[32m first appear? Give me the number and title.\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"assistant\",\"content\":\"\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":\u001b[0m\u001b[32m[\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"call_id\":\"19bd3554-e670-4856-89d0-c63f5b016245\",\"tool_name\":\"bravy_search\",\"arguments\":\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"query\":\"Bill Cosby South Park episode\"\u001b[0m\u001b[32m}\u001b[0m\u001b[32m}\u001b[0m\u001b[32m]\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"What is the British-American kickboxer Andrew Tate\\'s kickboxing name?\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"assistant\",\"content\":\"\",\"stop_reason\":\"end_of_turn\",\"tool_calls\":\u001b[0m\u001b[32m[\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"call_id\":\"526045a7-5f51-40fb-ba97-5ad29610e511\",\"tool_name\":\"brave_search\",\"arguments\":\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"query\":\"Andrew Tate kickboxing name\"\u001b[0m\u001b[32m}\u001b[0m\u001b[32m}\u001b[0m\u001b[32m]\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"ipython\",\"call_id\":\"526045a7-5f51-40fb-ba97-5ad29610e511\",\"tool_name\":\"brave_search\",\"content\":\"\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"query\\\\\": \\\\\"Andrew Tate kickboxing name\\\\\", \\\\\"top_k\\\\\": \u001b[0m\u001b[32m[\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"title\\\\\": \\\\\"Andrew Tate kickboxing record: How many championships ... - FirstSportz\\\\\", \\\\\"url\\\\\": \\\\\"https://firstsportz.com/mma-how-many-championships-does-andrew-tate-have/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate\\'s Kickboxing career. During his kickboxing career, he used the nickname \\\\\\\\\\\\\"King Cobra,\\\\\\\\\\\\\" which he currently uses as his Twitter name. Tate had an unorthodox style of movement inside the ring. 
He kept his hands down most of the time and relied on quick jabs and an overhand right to land significant strikes.\\\\\", \\\\\"score\\\\\": 0.9996244, \\\\\"raw_content\\\\\": null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m, \u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"title\\\\\": \\\\\"Andrew Tate: Kickboxing Record, Facts, Height, Weight, Age, Biography\\\\\", \\\\\"url\\\\\": \\\\\"https://www.lowkickmma.com/andrew-tate-kickboxing-record-facts-height-weight-age-biography/\\\\\", \\\\\"content\\\\\": \\\\\"Birth Name: Emory Andrew Tate III: Date of Birth: 1 December 1986: Place of Birth: Washington, D.C., U.S. ... In his professional kickboxing career, Andrew Tate won 32 of his fights by knockout.\\\\\", \\\\\"score\\\\\": 0.99909246, \\\\\"raw_content\\\\\": null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m, \u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"title\\\\\": \\\\\"Who is Andrew Tate? MMA, kickboxing record and controversies of fighter ...\\\\\", \\\\\"url\\\\\": \\\\\"https://www.sportingnews.com/us/kickboxing/news/andrew-tate-mma-kickboxing-record-controversies/u50waalc9cfz7krjg9wnyb7p\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate kickboxing record After launching his career as a 20-year-old in 2007, Tate built a formidable kickboxing record that included 76 wins across 85 fights in more than 13 years in the ring.\\\\\", \\\\\"score\\\\\": 0.9976586, \\\\\"raw_content\\\\\": null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m, \u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"title\\\\\": \\\\\"About Andrew Tate: A Journey from Champion to Controversy\\\\\", \\\\\"url\\\\\": \\\\\"https://reachmorpheus.com/andrew-tate/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate\\'s kickboxing career, beginning in 2005, is a tale of determination and skill. He quickly made a name for himself in the sport, rising through the ranks with his unique fighting style and strategic approach, honed by his chess-playing background.\\\\\", \\\\\"score\\\\\": 0.99701905, \\\\\"raw_content\\\\\": null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m, \u001b[0m\u001b[32m{\u001b[0m\u001b[32m\\\\\"title\\\\\": \\\\\"Andrew Tate Bio, Wiki, Net Worth, Age, Family, MMA Career - Next Biography\\\\\", \\\\\"url\\\\\": \\\\\"https://www.nextbiography.com/andrew-tate/\\\\\", \\\\\"content\\\\\": \\\\\"Andrew Tate Age. Andrew Tate is 36 years old as of 2023, born on December 1, 1986, in Washington, DC. By his mid-thirties, Andrew Tate has become an esteemed figure in the world of kickboxing, showcasing remarkable expertise and experience in the sport. Early Life of Andrew Tate. 
Andrew Tate was born on 01 December 1986 to an African-American\\\\\", \\\\\"score\\\\\": 0.99368566, \\\\\"raw_content\\\\\": null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m]\u001b[0m\u001b[32m}\u001b[0m\u001b[32m\"\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[1m]\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'output'\u001b[0m: \u001b[32m'content: Andrew Tate\\'s kickboxing name is \"King Cobra.\" tool_calls: \u001b[0m\u001b[32m[\u001b[0m\u001b[32m]\u001b[0m\u001b[32m'\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m}\u001b[0m\n", "\u001b[1m]\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "print(f\"Getting traces for session_id={session_id}\")\n", "import json\n", "from rich.pretty import pprint\n", "\n", "agent_logs = []\n", "\n", "for span in client.telemetry.query_spans(\n", "    attribute_filters=[\n", "        {\"key\": \"session_id\", \"op\": \"eq\", \"value\": session_id},\n", "    ],\n", "    attributes_to_return=[\"input\", \"output\"]\n", "    ):\n", "    if span.attributes[\"output\"] != \"no shields\":\n", "        agent_logs.append(span.attributes)\n", "\n", "pprint(agent_logs)" ] }, { "cell_type": "markdown", "id": "QF30H7ufP2RE", "metadata": { "id": "QF30H7ufP2RE" }, "source": [ "##### 3.1.3 Post-Process Telemetry Results & Evaluate\n", "\n", "- Now, we want to run evaluation to assert that our search agent successfully calls brave_search from online traces.\n", "- We will first post-process the agent's telemetry logs and run evaluation." ] }, { "cell_type": "code", "execution_count": null, "id": "sy4Xaff_Avuu", "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 411 }, "id": "sy4Xaff_Avuu", "outputId": "cb68bae7-b21d-415d-8e71-612bd383c793" }, "outputs": [ { "data": { "text/html": [ "
[\n", "│ {\n", "│ │ 'input_query': '{\"role\":\"user\",\"content\":\"Which teams played in the NBA western conference finals of 2024\",\"context\":null}',\n", "│ │ 'generated_answer': 'content: Let me check the latest sports news. tool_calls: []',\n", "│ │ 'expected_answer': 'brave_search'\n", "│ },\n", "│ {\n", "│ │ 'input_query': '{\"role\":\"user\",\"content\":\"In which episode and season of South Park does Bill Cosby (BSM-471) first appear? Give me the number and title.\",\"context\":null}',\n", "│ │ 'generated_answer': \"content: tool_calls: [ToolCall(call_id='19bd3554-e670-4856-89d0-c63f5b016245', tool_name='bravy_search', arguments={'query': 'Bill Cosby South Park episode'})]\",\n", "│ │ 'expected_answer': 'brave_search'\n", "│ },\n", "│ {\n", "│ │ 'input_query': '{\"role\":\"user\",\"content\":\"What is the British-American kickboxer Andrew Tate\\'s kickboxing name?\",\"context\":null}',\n", "│ │ 'generated_answer': \"content: tool_calls: [ToolCall(call_id='526045a7-5f51-40fb-ba97-5ad29610e511', tool_name=<BuiltinTool.brave_search: 'brave_search'>, arguments={'query': 'Andrew Tate kickboxing name'})]\",\n", "│ │ 'expected_answer': 'brave_search'\n", "│ }\n", "]\n", "\n" ], "text/plain": [ "\u001b[1m[\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m{\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'input_query'\u001b[0m: \u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"Which teams played in the NBA western conference finals of 2024\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'generated_answer'\u001b[0m: \u001b[32m'content: Let me check the latest sports news. tool_calls: \u001b[0m\u001b[32m[\u001b[0m\u001b[32m]\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'expected_answer'\u001b[0m: \u001b[32m'brave_search'\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m}\u001b[0m,\n", "\u001b[2;32m│ \u001b[0m\u001b[1m{\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'input_query'\u001b[0m: \u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"In which episode and season of South Park does Bill Cosby \u001b[0m\u001b[32m(\u001b[0m\u001b[32mBSM-471\u001b[0m\u001b[32m)\u001b[0m\u001b[32m first appear? 
Give me the number and title.\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'generated_answer'\u001b[0m: \u001b[32m\"content: tool_calls: \u001b[0m\u001b[32m[\u001b[0m\u001b[32mToolCall\u001b[0m\u001b[32m(\u001b[0m\u001b[32mcall_id\u001b[0m\u001b[32m='19bd3554-e670-4856-89d0-c63f5b016245', \u001b[0m\u001b[32mtool_name\u001b[0m\u001b[32m='bravy_search', \u001b[0m\u001b[32marguments\u001b[0m\u001b[32m=\u001b[0m\u001b[32m{\u001b[0m\u001b[32m'query': 'Bill Cosby South Park episode'\u001b[0m\u001b[32m}\u001b[0m\u001b[32m)\u001b[0m\u001b[32m]\u001b[0m\u001b[32m\"\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'expected_answer'\u001b[0m: \u001b[32m'brave_search'\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m}\u001b[0m,\n", "\u001b[2;32m│ \u001b[0m\u001b[1m{\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'input_query'\u001b[0m: \u001b[32m'\u001b[0m\u001b[32m{\u001b[0m\u001b[32m\"role\":\"user\",\"content\":\"What is the British-American kickboxer Andrew Tate\\'s kickboxing name?\",\"context\":null\u001b[0m\u001b[32m}\u001b[0m\u001b[32m'\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'generated_answer'\u001b[0m: \u001b[32m\"content: tool_calls: \u001b[0m\u001b[32m[\u001b[0m\u001b[32mToolCall\u001b[0m\u001b[32m(\u001b[0m\u001b[32mcall_id\u001b[0m\u001b[32m='526045a7-5f51-40fb-ba97-5ad29610e511', \u001b[0m\u001b[32mtool_name\u001b[0m\u001b[32m=\u001b[0m\u001b[32m<\u001b[0m\u001b[32mBuiltinTool.brave_search:\u001b[0m\u001b[32m 'brave_search'\u001b[0m\u001b[32m>\u001b[0m\u001b[32m, \u001b[0m\u001b[32marguments\u001b[0m\u001b[32m=\u001b[0m\u001b[32m{\u001b[0m\u001b[32m'query': 'Andrew Tate kickboxing name'\u001b[0m\u001b[32m}\u001b[0m\u001b[32m)\u001b[0m\u001b[32m]\u001b[0m\u001b[32m\"\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'expected_answer'\u001b[0m: \u001b[32m'brave_search'\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m}\u001b[0m\n", "\u001b[1m]\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "
ScoringScoreResponse(\n", "│ results={\n", "│ │ 'basic::subset_of': ScoringResult(\n", "│ │ │ aggregated_results={'accuracy': {'accuracy': 0.3333333333333333, 'num_correct': 1.0, 'num_total': 3}},\n", "│ │ │ score_rows=[{'score': 0.0}, {'score': 0.0}, {'score': 1.0}]\n", "│ │ )\n", "│ }\n", ")\n", "\n" ], "text/plain": [ "\u001b[1;35mScoringScoreResponse\u001b[0m\u001b[1m(\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[33mresults\u001b[0m=\u001b[1m{\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'basic::subset_of'\u001b[0m: \u001b[1;35mScoringResult\u001b[0m\u001b[1m(\u001b[0m\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[33maggregated_results\u001b[0m=\u001b[1m{\u001b[0m\u001b[32m'accuracy'\u001b[0m: \u001b[1m{\u001b[0m\u001b[32m'accuracy'\u001b[0m: \u001b[1;36m0.3333333333333333\u001b[0m, \u001b[32m'num_correct'\u001b[0m: \u001b[1;36m1.0\u001b[0m, \u001b[32m'num_total'\u001b[0m: \u001b[1;36m3\u001b[0m\u001b[1m}\u001b[0m\u001b[1m}\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[33mscore_rows\u001b[0m=\u001b[1m[\u001b[0m\u001b[1m{\u001b[0m\u001b[32m'score'\u001b[0m: \u001b[1;36m0.0\u001b[0m\u001b[1m}\u001b[0m, \u001b[1m{\u001b[0m\u001b[32m'score'\u001b[0m: \u001b[1;36m0.0\u001b[0m\u001b[1m}\u001b[0m, \u001b[1m{\u001b[0m\u001b[32m'score'\u001b[0m: \u001b[1;36m1.0\u001b[0m\u001b[1m}\u001b[0m\u001b[1m]\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[1m)\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m}\u001b[0m\n", "\u001b[1m)\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "# post-process telemetry spans and prepare data for eval\n", "# in this case, we want to assert that every user prompt is followed by a tool call\n", "import ast\n", "import json\n", "\n", "eval_rows = []\n", "\n", "for log in agent_logs:\n", "    last_msg = log['input'][-1]\n", "    if \"\\\"role\\\":\\\"user\\\"\" in last_msg:\n", "        eval_rows.append(\n", "            {\n", "                \"input_query\": last_msg,\n", "                \"generated_answer\": log[\"output\"],\n", "                # check whether generated_answer called the brave_search tool\n", "                \"expected_answer\": \"brave_search\",\n", "            },\n", "        )\n", "\n", "pprint(eval_rows)\n", "scoring_params = {\n", "    \"basic::subset_of\": None,\n", "}\n", "scoring_response = client.scoring.score(input_rows=eval_rows, scoring_functions=scoring_params)\n", "pprint(scoring_response)" ] }, { "cell_type": "markdown", "id": "IKbzhxcw5e_c", "metadata": { "id": "IKbzhxcw5e_c" }, "source": [ "#### 3.2. Agentic Application Dataset Scoring\n", "- Llama Stack offers a library of scoring functions and the `/scoring` API, allowing you to run evaluations on your pre-annotated AI application datasets.\n", "\n", "- In this example, we will work with an example RAG dataset you have built previously, label it with an annotation, and use LLM-As-Judge with a custom judge prompt for scoring. Please check out our [Llama Stack Playground](https://llama-stack.readthedocs.io/en/latest/playground/index.html) for an interactive interface to upload datasets and run scoring." ] }, { "cell_type": "code", "execution_count": null, "id": "xG4Y84VQBb0g", "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 298 }, "id": "xG4Y84VQBb0g", "outputId": "f61cebdf-f614-440c-d170-f1e873b542ef" }, "outputs": [ { "data": { "text/html": [ "
ScoringScoreResponse(\n", "│ results={\n", "│ │ 'llm-as-judge::base': ScoringResult(\n", "│ │ │ aggregated_results={},\n", "│ │ │ score_rows=[\n", "│ │ │ │ {\n", "│ │ │ │ │ 'score': 'B',\n", "│ │ │ │ │ 'judge_feedback': 'Answer: B, Explanation: The GENERATED_RESPONSE is a superset of the EXPECTED_RESPONSE and is fully consistent with it. The GENERATED_RESPONSE provides more detailed information about the top 5 topics related to LoRA, while the EXPECTED_RESPONSE only mentions \"LoRA\". The GENERATED_RESPONSE expands on the topic, but does not conflict with the EXPECTED_RESPONSE.'\n", "│ │ │ │ }\n", "│ │ │ ]\n", "│ │ ),\n", "│ │ 'basic::subset_of': ScoringResult(\n", "│ │ │ aggregated_results={'accuracy': 1.0, 'num_correct': 1.0, 'num_total': 1.0},\n", "│ │ │ score_rows=[{'score': 1.0}]\n", "│ │ )\n", "│ }\n", ")\n", "\n" ], "text/plain": [ "\u001b[1;35mScoringScoreResponse\u001b[0m\u001b[1m(\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[33mresults\u001b[0m=\u001b[1m{\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'llm-as-judge::base'\u001b[0m: \u001b[1;35mScoringResult\u001b[0m\u001b[1m(\u001b[0m\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[33maggregated_results\u001b[0m=\u001b[1m{\u001b[0m\u001b[1m}\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[33mscore_rows\u001b[0m=\u001b[1m[\u001b[0m\n", "\u001b[2;32m│ │ │ │ \u001b[0m\u001b[1m{\u001b[0m\n", "\u001b[2;32m│ │ │ │ │ \u001b[0m\u001b[32m'score'\u001b[0m: \u001b[32m'B'\u001b[0m,\n", "\u001b[2;32m│ │ │ │ │ \u001b[0m\u001b[32m'judge_feedback'\u001b[0m: \u001b[32m'Answer: B, Explanation: The GENERATED_RESPONSE is a superset of the EXPECTED_RESPONSE and is fully consistent with it. The GENERATED_RESPONSE provides more detailed information about the top 5 topics related to LoRA, while the EXPECTED_RESPONSE only mentions \"LoRA\". The GENERATED_RESPONSE expands on the topic, but does not conflict with the EXPECTED_RESPONSE.'\u001b[0m\n", "\u001b[2;32m│ │ │ │ \u001b[0m\u001b[1m}\u001b[0m\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[1m]\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[1m)\u001b[0m,\n", "\u001b[2;32m│ │ \u001b[0m\u001b[32m'basic::subset_of'\u001b[0m: \u001b[1;35mScoringResult\u001b[0m\u001b[1m(\u001b[0m\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[33maggregated_results\u001b[0m=\u001b[1m{\u001b[0m\u001b[32m'accuracy'\u001b[0m: \u001b[1;36m1.0\u001b[0m, \u001b[32m'num_correct'\u001b[0m: \u001b[1;36m1.0\u001b[0m, \u001b[32m'num_total'\u001b[0m: \u001b[1;36m1.0\u001b[0m\u001b[1m}\u001b[0m,\n", "\u001b[2;32m│ │ │ \u001b[0m\u001b[33mscore_rows\u001b[0m=\u001b[1m[\u001b[0m\u001b[1m{\u001b[0m\u001b[32m'score'\u001b[0m: \u001b[1;36m1.0\u001b[0m\u001b[1m}\u001b[0m\u001b[1m]\u001b[0m\n", "\u001b[2;32m│ │ \u001b[0m\u001b[1m)\u001b[0m\n", "\u001b[2;32m│ \u001b[0m\u001b[1m}\u001b[0m\n", "\u001b[1m)\u001b[0m\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import rich\n", "from rich.pretty import pprint\n", "\n", "judge_model_id = \"meta-llama/Llama-3.1-405B-Instruct-FP8\"\n", "\n", "JUDGE_PROMPT = \"\"\"\n", "Given a QUESTION and GENERATED_RESPONSE and EXPECTED_RESPONSE.\n", "\n", "Compare the factual content of the GENERATED_RESPONSE with the EXPECTED_RESPONSE. Ignore any differences in style, grammar, or punctuation.\n", " The GENERATED_RESPONSE may either be a subset or superset of the EXPECTED_RESPONSE, or it may conflict with it. Determine which case applies. 
Answer the question by selecting one of the following options:\n", " (A) The GENERATED_RESPONSE is a subset of the EXPECTED_RESPONSE and is fully consistent with it.\n", " (B) The GENERATED_RESPONSE is a superset of the EXPECTED_RESPONSE and is fully consistent with it.\n", " (C) The GENERATED_RESPONSE contains all the same details as the EXPECTED_RESPONSE.\n", " (D) There is a disagreement between the GENERATED_RESPONSE and the EXPECTED_RESPONSE.\n", " (E) The answers differ, but these differences don't matter from the perspective of factuality.\n", "\n", "Give your answer in the format \"Answer: One of ABCDE, Explanation: \".\n", "\n", "Your actual task:\n", "\n", "QUESTION: {input_query}\n", "GENERATED_RESPONSE: {generated_answer}\n", "EXPECTED_RESPONSE: {expected_answer}\n", "\"\"\"\n", "\n", "input_query = \"What are the top 5 topics that were explained? Only list succinct bullet points.\"\n", "generated_answer = \"\"\"\n", "Here are the top 5 topics that were explained in the documentation for Torchtune:\n", "\n", "* What is LoRA and how does it work?\n", "* Fine-tuning with LoRA: memory savings and parameter-efficient finetuning\n", "* Running a LoRA finetune with Torchtune: overview and recipe\n", "* Experimenting with different LoRA configurations: rank, alpha, and attention modules\n", "* LoRA finetuning\n", "\"\"\"\n", "expected_answer = \"\"\"LoRA\"\"\"\n", "\n", "rows = [\n", " {\n", " \"input_query\": input_query,\n", " \"generated_answer\": generated_answer,\n", " \"expected_answer\": expected_answer,\n", " },\n", "]\n", "\n", "scoring_params = {\n", " \"llm-as-judge::base\": {\n", " \"judge_model\": judge_model_id,\n", " \"prompt_template\": JUDGE_PROMPT,\n", " \"type\": \"llm_as_judge\",\n", " \"judge_score_regexes\": [\"Answer: (A|B|C|D|E)\"],\n", " },\n", " \"basic::subset_of\": None,\n", "}\n", "\n", "response = client.scoring.score(input_rows=rows, scoring_functions=scoring_params)\n", "pprint(response)" ] }, { "cell_type": "code", "execution_count": null, "id": "rKtGo_v98UA2", "metadata": { "id": "rKtGo_v98UA2" }, "outputs": [], "source": [] } ], "metadata": { "colab": { "collapsed_sections": [ "_JueJAKyJR5m" ], "provenance": [] }, "kernelspec": { "display_name": "Python 3", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.10.15" }, "widgets": { "application/vnd.jupyter.widget-state+json": { "0243626d7ef44ef2b90e8fed5c13183d": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "044d6d8dda1c4935b1752a9c71c6ee4a": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "FloatProgressModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "FloatProgressModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "ProgressView", "bar_style": "success", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_63f34c3d43bb4fdd9faeb6161fd77285", "max": 1, 
"min": 0, "orientation": "horizontal", "style": "IPY_MODEL_5cb841b49eaa429e8616ec4b78f501e9", "value": 1 } }, "0640b57408644741970dd958ca0e21e6": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HBoxModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HBoxModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HBoxView", "box_style": "", "children": [ "IPY_MODEL_6259ffc3ef674df985fd3fa4334f9c8e", "IPY_MODEL_3d0376d2e574410eb4ef963d51cac0a6", "IPY_MODEL_b66984cc5de541a5801a1e6e54d40daf" ], "layout": "IPY_MODEL_92135b9cb201475681ee0886887c84a8" } }, "116139bfe7a44f969a2c97490c224d31": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HTMLModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_ab1f339cba094c918fc5507f8361de5c", "placeholder": "", "style": "IPY_MODEL_a6a1eb412f204578b80e5b6717c1e3a5", "value": " 1/1 [00:01<00:00, 1.27s/it]" } }, "118b359b83304ae59fad57e28f621645": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "ProgressStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "ProgressStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "bar_color": null, "description_width": "" } }, "15d3ff07f1c54e58b51d452caca01209": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "17603dd7fedf4798a74533fbfd5bb421": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "186682be50c148c0826fa7c314087562": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HTMLModel", 
"state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_1f427d4273e04e19b1bdb13388736c01", "placeholder": "", "style": "IPY_MODEL_38897429b7cf4077aea3a981593ca866", "value": " 1/1 [00:00<00:00, 15.09it/s]" } }, "1f427d4273e04e19b1bdb13388736c01": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "2082554eed6644a996f0e31545789e08": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HBoxModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HBoxModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HBoxView", "box_style": "", "children": [ "IPY_MODEL_a0be415018644c3cac098ab9b19c2391", "IPY_MODEL_6ede3649e8c24015b3ca77490568bfcd", "IPY_MODEL_116139bfe7a44f969a2c97490c224d31" ], "layout": "IPY_MODEL_243d13828d854880a6adb861ea867734" } }, "2100363a158b4488a58620983aa5bdd4": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "243d13828d854880a6adb861ea867734": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, 
"margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "277101c35a784e6caf455a13cd9b8e59": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "2924814bab5748ddbeeedc70d324195e": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HBoxModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HBoxModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HBoxView", "box_style": "", "children": [ "IPY_MODEL_4738bccc6b384da5a20a8bcd61ecec59", "IPY_MODEL_044d6d8dda1c4935b1752a9c71c6ee4a", "IPY_MODEL_9277709ad9154d7b8f37d08db84ee425" ], "layout": "IPY_MODEL_f3f1f2487d6f455caeb6ec71a2d51ee2" } }, "2958af7c9cdb46038e0336d6b7c6773e": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "351928faa62543128e0bd29bf89bbf79": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "38897429b7cf4077aea3a981593ca866": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "3978f618c4f8467eb83c63a8f5aef98a": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "ProgressStyleModel", "state": { 
"_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "ProgressStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "bar_color": null, "description_width": "" } }, "3d0376d2e574410eb4ef963d51cac0a6": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "FloatProgressModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "FloatProgressModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "ProgressView", "bar_style": "success", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_9054d3825edb49cb9c35d24023f50c03", "max": 1, "min": 0, "orientation": "horizontal", "style": "IPY_MODEL_3978f618c4f8467eb83c63a8f5aef98a", "value": 1 } }, "425c6c0eaed741669551b9af77096c6f": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HBoxModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HBoxModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HBoxView", "box_style": "", "children": [ "IPY_MODEL_d124b09896934d289df649375f455a8e", "IPY_MODEL_554cff1a83d44bd2bbd36fd43acac7e2", "IPY_MODEL_d0381718fc8b49a6ac7e7fe85cabba90" ], "layout": "IPY_MODEL_fd3daaf9093d45d8a9d39b87835f4582" } }, "457374ae3035496eb943ad21484f76a0": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HBoxModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HBoxModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HBoxView", "box_style": "", "children": [ "IPY_MODEL_bcf4679dda2d4767a0a24cbf236ca76e", "IPY_MODEL_6e4ce98853c84beca11471e7ea9d97df", "IPY_MODEL_186682be50c148c0826fa7c314087562" ], "layout": "IPY_MODEL_e1ef246e3e6c4359b7b61c341119e121" } }, "45b569d733f944d29cefae8a5d13b215": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "4738bccc6b384da5a20a8bcd61ecec59": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HTMLModel", "state": { "_dom_classes": [], 
"_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_66c92a8a89234a61a8c688cf1c3e29a1", "placeholder": "", "style": "IPY_MODEL_ee1f4a0c85e44a3b849283337743a8d4", "value": "Batches: 100%" } }, "4a405d391b974e58a2c4fe00d4bb5815": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "4ad57f5d8a824afab639e8606ee43ca6": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "53865d3f918e468ab53504133b127973": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "554cff1a83d44bd2bbd36fd43acac7e2": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "FloatProgressModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "FloatProgressModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "ProgressView", "bar_style": "success", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_6c60c8291e734f549e6c5a46b427b974", "max": 1, "min": 0, "orientation": "horizontal", "style": "IPY_MODEL_de88640505c24928904a3c76bda31c70", "value": 1 } }, "5afdb88e0159462e98773560e3dad439": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HBoxModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HBoxModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HBoxView", "box_style": 
"", "children": [ "IPY_MODEL_f7bc4df675a141e380d965138552a142", "IPY_MODEL_d7bf8b49145843ac98a6de424e628729", "IPY_MODEL_8fb17faf68524de2b73321d71b80b407" ], "layout": "IPY_MODEL_45b569d733f944d29cefae8a5d13b215" } }, "5cb841b49eaa429e8616ec4b78f501e9": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "ProgressStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "ProgressStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "bar_color": null, "description_width": "" } }, "5f19dab8c6da4050bc47fd78838f7530": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "ProgressStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "ProgressStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "bar_color": null, "description_width": "" } }, "6259ffc3ef674df985fd3fa4334f9c8e": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HTMLModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_4a405d391b974e58a2c4fe00d4bb5815", "placeholder": "", "style": "IPY_MODEL_2958af7c9cdb46038e0336d6b7c6773e", "value": "Batches: 100%" } }, "63f34c3d43bb4fdd9faeb6161fd77285": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "66c92a8a89234a61a8c688cf1c3e29a1": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, 
"grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "6c60c8291e734f549e6c5a46b427b974": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "6e4ce98853c84beca11471e7ea9d97df": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "FloatProgressModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "FloatProgressModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "ProgressView", "bar_style": "success", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_a0ac7ee92d994c7b9b74e580ab2acdf7", "max": 1, "min": 0, "orientation": "horizontal", "style": "IPY_MODEL_118b359b83304ae59fad57e28f621645", "value": 1 } }, "6ede3649e8c24015b3ca77490568bfcd": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "FloatProgressModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "FloatProgressModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "ProgressView", "bar_style": "success", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_f10237315e794539a00ca82bfff930be", "max": 1, "min": 0, "orientation": "horizontal", "style": "IPY_MODEL_ca09d2207b00456da4c37b5a782a190c", "value": 1 } }, "753dbe7891a143118b55eccf8c252e03": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, 
"grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "8fb17faf68524de2b73321d71b80b407": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HTMLModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_277101c35a784e6caf455a13cd9b8e59", "placeholder": "", "style": "IPY_MODEL_d06666f765764f949e1876f2d5d67242", "value": " 1/1 [00:01<00:00, 1.68s/it]" } }, "9054d3825edb49cb9c35d24023f50c03": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "92135b9cb201475681ee0886887c84a8": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "9277709ad9154d7b8f37d08db84ee425": { "model_module": "@jupyter-widgets/controls", 
"model_module_version": "1.5.0", "model_name": "HTMLModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_a447ea9af3e14e5e94eb14ed8dd3c0de", "placeholder": "", "style": "IPY_MODEL_0243626d7ef44ef2b90e8fed5c13183d", "value": " 1/1 [00:02<00:00, 2.65s/it]" } }, "a0ac7ee92d994c7b9b74e580ab2acdf7": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "a0be415018644c3cac098ab9b19c2391": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HTMLModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_e4b1dfe159304c5f88766b33e85a5c19", "placeholder": "", "style": "IPY_MODEL_2100363a158b4488a58620983aa5bdd4", "value": "Batches: 100%" } }, "a447ea9af3e14e5e94eb14ed8dd3c0de": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "a6a1eb412f204578b80e5b6717c1e3a5": { "model_module": "@jupyter-widgets/controls", "model_module_version": 
"1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "ab1f339cba094c918fc5507f8361de5c": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "b66984cc5de541a5801a1e6e54d40daf": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HTMLModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_efd68f6dc0b3428e8f5fc830c1bf2341", "placeholder": "", "style": "IPY_MODEL_4ad57f5d8a824afab639e8606ee43ca6", "value": " 1/1 [00:00<00:00, 5.36it/s]" } }, "bbb93c771a9c453bb90e729b1f73b931": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "bcf4679dda2d4767a0a24cbf236ca76e": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HTMLModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, 
"_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_bbb93c771a9c453bb90e729b1f73b931", "placeholder": "", "style": "IPY_MODEL_351928faa62543128e0bd29bf89bbf79", "value": "Batches: 100%" } }, "ca09d2207b00456da4c37b5a782a190c": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "ProgressStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "ProgressStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "bar_color": null, "description_width": "" } }, "ce7de1af99434ad38a9382e7253dbfc0": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "d0381718fc8b49a6ac7e7fe85cabba90": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HTMLModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_fc086d0dd1a745308c59ae219ae135c5", "placeholder": "", "style": "IPY_MODEL_15d3ff07f1c54e58b51d452caca01209", "value": " 1/1 [00:00<00:00, 14.36it/s]" } }, "d06666f765764f949e1876f2d5d67242": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "d124b09896934d289df649375f455a8e": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HTMLModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_753dbe7891a143118b55eccf8c252e03", "placeholder": "", "style": "IPY_MODEL_ce7de1af99434ad38a9382e7253dbfc0", "value": "Batches: 100%" } }, "d7bf8b49145843ac98a6de424e628729": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "FloatProgressModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "FloatProgressModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "ProgressView", "bar_style": "success", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_17603dd7fedf4798a74533fbfd5bb421", "max": 1, "min": 0, "orientation": "horizontal", "style": "IPY_MODEL_5f19dab8c6da4050bc47fd78838f7530", "value": 1 
} }, "de88640505c24928904a3c76bda31c70": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "ProgressStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "ProgressStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "bar_color": null, "description_width": "" } }, "e1ef246e3e6c4359b7b61c341119e121": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "e4b1dfe159304c5f88766b33e85a5c19": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "ee1f4a0c85e44a3b849283337743a8d4": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "DescriptionStyleModel", "state": { "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "DescriptionStyleModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "StyleView", "description_width": "" } }, "efd68f6dc0b3428e8f5fc830c1bf2341": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", 
"align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "f10237315e794539a00ca82bfff930be": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "f3f1f2487d6f455caeb6ec71a2d51ee2": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "f7bc4df675a141e380d965138552a142": { "model_module": "@jupyter-widgets/controls", "model_module_version": "1.5.0", "model_name": "HTMLModel", "state": { "_dom_classes": [], "_model_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_model_name": "HTMLModel", "_view_count": null, "_view_module": "@jupyter-widgets/controls", "_view_module_version": "1.5.0", "_view_name": "HTMLView", "description": "", "description_tooltip": null, "layout": "IPY_MODEL_fdd057a4506f4f119d945bab5b930799", 
"placeholder": "", "style": "IPY_MODEL_53865d3f918e468ab53504133b127973", "value": "Batches: 100%" } }, "fc086d0dd1a745308c59ae219ae135c5": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "fd3daaf9093d45d8a9d39b87835f4582": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } }, "fdd057a4506f4f119d945bab5b930799": { "model_module": "@jupyter-widgets/base", "model_module_version": "1.2.0", "model_name": "LayoutModel", "state": { "_model_module": "@jupyter-widgets/base", "_model_module_version": "1.2.0", "_model_name": "LayoutModel", "_view_count": null, "_view_module": "@jupyter-widgets/base", "_view_module_version": "1.2.0", "_view_name": "LayoutView", "align_content": null, "align_items": null, "align_self": null, "border": null, "bottom": null, "display": null, "flex": null, "flex_flow": null, "grid_area": null, "grid_auto_columns": null, "grid_auto_flow": null, "grid_auto_rows": null, "grid_column": null, "grid_gap": null, "grid_row": null, "grid_template_areas": null, "grid_template_columns": null, "grid_template_rows": null, "height": null, "justify_content": null, "justify_items": null, "left": null, "margin": null, "max_height": null, "max_width": null, "min_height": null, "min_width": null, "object_fit": null, "object_position": null, "order": null, "overflow": null, "overflow_x": null, "overflow_y": null, "padding": null, "right": null, "top": null, "visibility": null, "width": null } } } } }, "nbformat": 4, 
"nbformat_minor": 5 }