{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# OCI Object Store Demo with Llama Stack\n", "This notebook shows how to configure Llama Stack to use OCI Object Storage through its S3-compatible API." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Step 1: Update config.yaml\n", "Update your `config.yaml` to include the S3 configuration for OCI Object Storage, as described in the [README](src/llama_stack/providers/remote/files/s3/README.md).\n", "### Example config.yaml configuration\n", "```yaml\n", "provider_type: remote::s3\n", "config:\n", " bucket_name: \"${env.S3_BUCKET_NAME}\"\n", " region: \"${env.AWS_REGION:=us-east-1}\"\n", " aws_access_key_id: \"${env.AWS_ACCESS_KEY_ID:=}\"\n", " aws_secret_access_key: \"${env.AWS_SECRET_ACCESS_KEY:=}\"\n", " endpoint_url: \"${env.S3_ENDPOINT_URL:=}\"\n", " metadata_store:\n", "   table_name: files_metadata\n", "   backend: sql_default\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Step 2: Set Environment Variables\n", "Create a `.env` file with your OCI credentials and bucket details.
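\n",
"\n",
"Before starting the server, you can sanity-check that these variables are present in your environment. This is a minimal sketch using only the standard library; `missing_s3_vars` is a hypothetical helper (not part of Llama Stack), and the variable names match the example `.env` file below:\n",
"\n",
"```python\n",
"import os\n",
"\n",
"def missing_s3_vars(env=os.environ):\n",
"    \"\"\"Return the required S3-compatibility variables not set in env.\"\"\"\n",
"    required = [\"AWS_ACCESS_KEY_ID\", \"AWS_SECRET_ACCESS_KEY\",\n",
"                \"S3_BUCKET_NAME\", \"S3_ENDPOINT_URL\"]\n",
"    return [name for name in required if not env.get(name)]\n",
"\n",
"# Report anything still missing before starting the server\n",
"print(missing_s3_vars() or \"All OCI S3 variables are set\")\n",
"```\n",
"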
\n", "For more information on generating the access/secret key pair, see this [document](https://docs.oracle.com/en-us/iaas/Content/Object/Tasks/s3compatibleapi.htm).\n", "For information on the checksum variables, see this [document](https://www.ateam-oracle.com/post/using-oci-os-s3-interface).\n", "\n", "### Example .env file content\n", "```\n", "AWS_ACCESS_KEY_ID=OCI_ACCESS_KEY\n", "AWS_SECRET_ACCESS_KEY=OCI_SECRET_KEY\n", "S3_BUCKET_NAME=OCI_BUCKET_NAME\n", "S3_ENDPOINT_URL=https://<namespace>.compat.objectstorage.<region>.oci.customer-oci.com\n", "AWS_REQUEST_CHECKSUM_CALCULATION=when_required\n", "AWS_RESPONSE_CHECKSUM_VALIDATION=when_required\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Step 3: Run Llama Stack Locally\n", "If this is your first time setting up and running llama-stack, see the [CONTRIBUTING](https://github.com/llamastack/llama-stack/blob/main/CONTRIBUTING.md) guide in the repository for environment setup.\n", "\n", "Then run the following command to start the Llama Stack server locally:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "!uv run --env-file=.env llama stack run oci" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Step 4: Upload and List Files using Files API" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import requests\n", "\n", "# Download a sample document to upload\n", "source = \"https://www.paulgraham.com/greatwork.html\"\n", "response = requests.get(source)\n", "response.raise_for_status()\n", "\n", "# Upload it through the Files API\n", "files = {'file': ('greatwork.html', response.content, 'text/html')}\n", "data = {'purpose': 'assistants'}\n", "response = requests.post('http://0.0.0.0:8321/v1/files', files=files, data=data)\n", "print(response.text)\n", "\n", "# List the uploaded files\n", "items = requests.get('http://0.0.0.0:8321/v1/files')\n", "for item in items.json()['data']:\n", "    print(item['id'])" ] } ], "metadata": { "kernelspec": { "display_name": "llama-stack", "language": "python", "name": "python3" },
"language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.12.12" } }, "nbformat": 4, "nbformat_minor": 2 }