llama-stack-mirror/src/llama_stack
ehhuang 1f9d48cd54
feat: openai files provider (#3946)
# What does this PR do?
- Adds an OpenAI files provider.
- Note that file content retrieval is limited by the file's `purpose`; see
  https://community.openai.com/t/file-uploads-error-why-can-t-i-download-files-with-purpose-user-data/1357013

## Test Plan
Modify the run yaml to use the OpenAI files provider:
```yaml
  files:
  - provider_id: openai
    provider_type: remote::openai
    config:
      api_key: ${env.OPENAI_API_KEY:=}
      metadata_store:
        backend: sql_default
        table_name: openai_files_metadata
```

Then run the files tests:

```
❯ uv run --no-sync ./scripts/integration-tests.sh --stack-config server:ci-tests --inference-mode replay --setup ollama --suite base --pattern test_files
```
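
Once the stack is running with this provider, the files API can also be exercised with any OpenAI-compatible client. The sketch below is a minimal example, not part of this PR: the base URL, path, file name, and `purpose` value are assumptions about a local deployment, and the `try/except` around content retrieval reflects the `purpose` limitation linked above.

```python
# Hedged sketch: poke the files API through the standard `openai` client.
# The base URL is an assumption; point it at wherever your Llama Stack
# server exposes its OpenAI-compatible routes.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8321/v1",  # assumed local Llama Stack endpoint
    api_key="none",                       # server-side auth may differ
)

# Upload a file; its metadata lands in the configured metadata_store.
uploaded = client.files.create(
    file=open("notes.txt", "rb"),  # hypothetical local file
    purpose="assistants",
)
print(uploaded.id, uploaded.purpose)

# Metadata retrieval and listing work regardless of purpose.
print(client.files.retrieve(uploaded.id))

# Content retrieval may be rejected upstream depending on the file's
# purpose (see the linked community thread), so treat failures as expected.
try:
    content = client.files.content(uploaded.id)
    print(content.read()[:200])
except Exception as exc:
    print(f"content retrieval not permitted for this purpose: {exc}")
```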
2025-10-28 16:25:03 -07:00
| Name | Last commit | Date |
|------|-------------|------|
| apis | feat(responses)!: introduce OpenAI compatible prompts to Responses API (#3942) | 2025-10-28 09:31:27 -07:00 |
| cli | fix(mypy): resolve OpenAI SDK and provider type issues (#3936) | 2025-10-28 10:54:29 -07:00 |
| core | fix(inference): enable routing of models with provider_data alone (#3928) | 2025-10-28 11:16:37 -07:00 |
| distributions | docs: add documentation on how to use custom run yaml in docker (#3949) | 2025-10-28 16:05:44 -07:00 |
| models | fix(mypy): resolve provider utility and testing type issues (#3935) | 2025-10-28 10:37:27 -07:00 |
| providers | feat: openai files provider (#3946) | 2025-10-28 16:25:03 -07:00 |
| strong_typing | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| testing | fix(mypy): add type stubs and fix typing issues (#3938) | 2025-10-28 11:00:09 -07:00 |
| ui | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `__init__.py` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `env.py` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `log.py` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `schema_utils.py` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |