mirror of
https://github.com/meta-llama/llama-stack.git
synced 2025-12-17 22:57:13 +00:00
# What does this PR do? This PR validates and allows access to OCI Object Storage through the S3 compatibility API. Additional documentation for OCI is supplied, in notebook form, as well. ## Test Plan --------- Co-authored-by: raghotham <rsm@meta.com>
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# OCI Object Store Demo with Llama Stack\n",
"This notebook demonstrates how to set up OCI Object Storage with Llama Stack."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 1: Update config.yaml\n",
"Update your `config.yaml` to include the S3 configuration for OCI Object Storage as described in the [README](src/llama_stack/providers/remote/files/s3/README.md).\n",
"### Example config.yaml configuration\n",
"```yaml\n",
"provider_type: remote::s3\n",
"config:\n",
"  bucket_name: \"${env.S3_BUCKET_NAME}\"\n",
"  region: \"${env.AWS_REGION:=us-east-1}\"\n",
"  aws_access_key_id: \"${env.AWS_ACCESS_KEY_ID:=}\"\n",
"  aws_secret_access_key: \"${env.AWS_SECRET_ACCESS_KEY:=}\"\n",
"  endpoint_url: \"${env.S3_ENDPOINT_URL:=}\"\n",
"  metadata_store:\n",
"    table_name: files_metadata\n",
"    backend: sql_default\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 2: Set Environment Variables\n",
"Create a `.env` file with your OCI credentials and bucket details.\n",
"For more information on generating the access/secret keys, see this [document](https://docs.oracle.com/en-us/iaas/Content/Object/Tasks/s3compatibleapi.htm).\n",
"For information on the checksum variables, see this [document](https://www.ateam-oracle.com/post/using-oci-os-s3-interface).\n",
"\n",
"### Example .env file content\n",
"```\n",
"AWS_ACCESS_KEY_ID=OCI_ACCESS_KEY\n",
"AWS_SECRET_ACCESS_KEY=OCI_SECRET_KEY\n",
"S3_BUCKET_NAME=OCI_BUCKET_NAME\n",
"S3_ENDPOINT_URL=https://<namespace>.compat.objectstorage.<region>.oci.customer-oci.com\n",
"AWS_REQUEST_CHECKSUM_CALCULATION=when_required\n",
"AWS_RESPONSE_CHECKSUM_VALIDATION=when_required\n",
"```"
]
},
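{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before starting the server, you can sanity-check that the variables from `.env` are present. The cell below is a minimal sketch, assuming the `.env` file sits in the current directory; it uses only the standard library and prints nothing sensitive."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"# Load .env manually so this check needs no extra dependencies\n",
"with open('.env') as f:\n",
"    for line in f:\n",
"        line = line.strip()\n",
"        if line and not line.startswith('#') and '=' in line:\n",
"            key, _, value = line.partition('=')\n",
"            os.environ.setdefault(key.strip(), value.strip())\n",
"\n",
"# Report whether each required variable is set, without echoing its value\n",
"for var in ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'S3_BUCKET_NAME', 'S3_ENDPOINT_URL']:\n",
"    print(f\"{var}: {'set' if os.environ.get(var) else 'MISSING'}\")"
]
},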
|
|
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 3: Run Llama Stack Locally\n",
"Run the following command to start the Llama Stack server locally.\n",
"\n",
"To set up your environment for a first-time run of llama-stack, see the [CONTRIBUTING](https://github.com/llamastack/llama-stack/blob/main/CONTRIBUTING.md) document in the repo."
]
},
|
|
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!uv run --env-file=.env llama stack run oci"
]
},
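{
"cell_type": "markdown",
"metadata": {},
"source": [
"Once the server is up, you can verify it is reachable before moving on. The cell below is a minimal sketch that queries the health endpoint (assumed here to be `http://0.0.0.0:8321/v1/health`, matching the host and port used later in this notebook; adjust if your deployment differs)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import requests\n",
"\n",
"# A 200 response indicates the stack is serving requests\n",
"resp = requests.get('http://0.0.0.0:8321/v1/health')\n",
"print(resp.status_code, resp.text)"
]
},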
|
|
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 4: Upload and List Files using the Files API"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import requests\n",
"\n",
"# Upload a file\n",
"source = \"https://www.paulgraham.com/greatwork.html\"\n",
"response = requests.get(source)\n",
"files = {'file': ('greatwork.html', response.content, 'text/html')}\n",
"data = {'purpose': 'assistants'}\n",
"response = requests.post('http://0.0.0.0:8321/v1/files', files=files, data=data)\n",
"print(response.text)\n",
"\n",
"# List uploaded files\n",
"items = requests.get('http://0.0.0.0:8321/v1/files')\n",
"for item in items.json()['data']:\n",
"    print(item['id'])"
]
}
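,
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can also read a file back from the object store to confirm the round trip. The cell below is a sketch assuming the OpenAI-compatible Files endpoints `GET /v1/files/{file_id}` (metadata) and `GET /v1/files/{file_id}/content` (raw bytes); it reuses the file list from the previous cell."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import requests\n",
"\n",
"items = requests.get('http://0.0.0.0:8321/v1/files').json()['data']\n",
"if items:\n",
"    file_id = items[0]['id']\n",
"    # Metadata for a single file\n",
"    meta = requests.get(f'http://0.0.0.0:8321/v1/files/{file_id}')\n",
"    print(meta.json())\n",
"    # First bytes of the stored contents\n",
"    content = requests.get(f'http://0.0.0.0:8321/v1/files/{file_id}/content')\n",
"    print(content.content[:200])"
]
}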
|
|
],
"metadata": {
"kernelspec": {
"display_name": "llama-stack",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.12"
}
},
"nbformat": 4,
"nbformat_minor": 2
}