# llama-stack-mirror/llama_stack/ui

Latest commit: **feat(cherry-pick): fixes for 0.3.1 release (#3998)** by Ashwin Bharambe (39f33f7f12), 2025-10-30 21:51:42 -07:00
## Summary

Cherry-picks 5 critical fixes from main to the release-0.3.x branch for
the v0.3.1 release, plus CI workflow updates.

**Note**: This recreates the cherry-picks from the closed PR #3991, now
targeting the renamed `release-0.3.x` branch (previously
`release-0.3.x-maint`).

## Commits

1. **2c56a8560** - fix(context): prevent provider data leak between streaming requests (#3924)
   - **CRITICAL SECURITY FIX**: prevents provider credentials from leaking between requests (see the first sketch after this list)
   - Fixed import path for 0.3.0 compatibility

2. **ddd32b187** - fix(inference): enable routing of models with provider_data alone (#3928)
   - Enables routing for fully qualified model IDs with provider_data
   - Resolved merge conflicts and adapted for the 0.3.0 structure

3. **f7c2973aa** - fix: Avoid BadRequestError due to invalid max_tokens (#3667)
   - Fixes failures with Gemini and other providers that reject `max_tokens=0` (sketch below)
   - Non-breaking API change

4. **d7f9da616** - fix(responses): sync conversation before yielding terminal events in streaming (#3888)
   - Ensures the conversation sync executes even when streaming consumers break early (sketch below)

5. **0ffa8658b** - fix(logging): ensure logs go to stderr, loggers obey levels (#3885)
   - Fixes the logging infrastructure so handlers write to stderr and named loggers honor their configured levels (sketch below)

6. **75b49cb3c** - ci: support release branches and match client branch (#3990)
   - Updates CI workflows to support release-X.Y.x branches
   - Matches the client branch from llama-stack-client-python for release testing
   - Fixes artifact name collisions
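
The provider-data leak fixed in item 1 is the classic pitfall of request-scoped state outliving a streaming response. Purely as illustration (the names below are made up, not the actual llama_stack internals), the general technique is to reset the request's context variable in a `finally` block so the cleanup runs even if the client stops consuming the stream:

```python
import contextvars
from typing import AsyncIterator

# Hypothetical request-scoped store; the real code keeps provider
# credentials under a different name.
_provider_data: contextvars.ContextVar[dict | None] = contextvars.ContextVar(
    "provider_data", default=None
)

async def stream_with_cleanup(chunks: AsyncIterator[str]) -> AsyncIterator[str]:
    """Yield chunks, then clear request-scoped provider data no matter how
    the consumer exits (normal completion, exception, or early break)."""
    token = _provider_data.set({"api_key": "per-request-secret"})
    try:
        async for chunk in chunks:
            yield chunk
    finally:
        # Without this reset, the next request served in the same context
        # could observe the previous caller's credentials.
        _provider_data.reset(token)
```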
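
For item 3, the failure mode is forwarding `max_tokens=0` (or another non-positive placeholder) to providers such as Gemini that reject it. A hedged sketch of the idea, with an invented helper name rather than the actual patch: omit the field unless it is a positive integer so the provider applies its own default.

```python
def build_completion_params(prompt: str, max_tokens: int | None = None) -> dict:
    """Illustrative helper: only include max_tokens when it is positive,
    so providers that reject 0 never see it."""
    params: dict = {"prompt": prompt}
    if max_tokens is not None and max_tokens > 0:
        params["max_tokens"] = max_tokens
    return params

assert "max_tokens" not in build_completion_params("hi", max_tokens=0)
assert build_completion_params("hi", max_tokens=128)["max_tokens"] == 128
```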
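
Item 4 is about cleanup ordering in streaming responses: anything scheduled after the final `yield` never runs if the consumer breaks out of the loop early. A sketch of the pattern with hypothetical names (not the real implementation): perform the sync before yielding the terminal event, with a `finally` backstop for early disconnects.

```python
from typing import AsyncIterator, Awaitable, Callable

async def stream_response(
    events: AsyncIterator[dict],
    sync_conversation: Callable[[], Awaitable[None]],
) -> AsyncIterator[dict]:
    """Yield streaming events; persist the conversation even if the caller
    stops iterating before the terminal event is yielded."""
    synced = False
    try:
        async for event in events:
            if event.get("type") == "response.completed" and not synced:
                await sync_conversation()
                synced = True
            yield event
    finally:
        if not synced:
            # Consumer disconnected early: still record what we have.
            await sync_conversation()
```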
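
Item 5 concerns logging hygiene: handlers should write to stderr so stdout stays clean for actual output, and named loggers should honor their configured levels. A minimal, generic sketch (not the llama_stack logging setup itself):

```python
import logging
import sys

def setup_logger(name: str, level: int = logging.INFO) -> logging.Logger:
    """Illustrative setup: send records to stderr and respect `level`."""
    logger = logging.getLogger(name)
    logger.setLevel(level)
    if not logger.handlers:
        handler = logging.StreamHandler(sys.stderr)  # stderr, not stdout
        handler.setFormatter(logging.Formatter("%(levelname)s %(name)s: %(message)s"))
        logger.addHandler(handler)
    return logger

log = setup_logger("llama_stack.example", logging.DEBUG)
log.debug("debug messages appear because the logger obeys its level")
```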

## Adaptations for 0.3.0

- Fixed import paths: `llama_stack.core.telemetry.tracing` → `llama_stack.providers.utils.telemetry.tracing` (see the sketch after this list)
- Fixed import paths: `llama_stack.core.telemetry.telemetry` → `llama_stack.apis.telemetry`
- Changed `self.telemetry_enabled` → `self.telemetry` (the 0.3.0 attribute name)
- Removed the `rerank()` method, which doesn't exist in 0.3.0
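
The cherry-picks were adapted by rewriting the imports for the 0.3.x layout. Purely as an aside (not what this PR does), a compatibility shim that tolerates both layouts could look like this:

```python
# Sketch only: prefer the newer path used on main, fall back to the 0.3.x path.
try:
    from llama_stack.core.telemetry import tracing  # layout on main
except ImportError:
    from llama_stack.providers.utils.telemetry import tracing  # 0.3.x layout
```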

## Testing

All imports have been verified; tests should pass once CI is set up.
| Name | Last commit | Date |
|------|-------------|------|
| app | fix: Remove deprecated user param in OpenAIResponseObject (#3596) | 2025-09-29 13:55:59 -07:00 |
| components | fix: Remove deprecated user param in OpenAIResponseObject (#3596) | 2025-09-29 13:55:59 -07:00 |
| e2e | feat(UI): Adding linter and prettier for UI (#3156) | 2025-08-14 15:58:43 -06:00 |
| hooks | feat(UI): Adding linter and prettier for UI (#3156) | 2025-08-14 15:58:43 -06:00 |
| lib | feat(UI): Implementing File Upload and VectorDB Creation/Configuration in Playground (#3266) | 2025-08-28 05:03:31 -06:00 |
| public | feat(UI): Adding a session manager (#3203) | 2025-08-21 21:11:03 -04:00 |
| types | feat(auth,ui): support github sign-in in the UI (#2545) | 2025-07-08 11:02:57 -07:00 |
| .gitignore | feat(ui): add infinite scroll pagination to chat completions/responses logs table (#2466) | 2025-06-18 15:28:39 -07:00 |
| .nvmrc | feat(UI): Adding linter and prettier for UI (#3156) | 2025-08-14 15:58:43 -06:00 |
| .prettierignore | feat(UI): Adding linter and prettier for UI (#3156) | 2025-08-14 15:58:43 -06:00 |
| .prettierrc | feat(UI): Adding linter and prettier for UI (#3156) | 2025-08-14 15:58:43 -06:00 |
| components.json | feat(UI): adding MVP playground UI (#2828) | 2025-07-30 19:44:16 -07:00 |
| eslint.config.mjs | feat(UI): Adding linter and prettier for UI (#3156) | 2025-08-14 15:58:43 -06:00 |
| instrumentation.ts | feat(auth,ui): support github sign-in in the UI (#2545) | 2025-07-08 11:02:57 -07:00 |
| jest.config.ts | feat(ui): add infinite scroll pagination to chat completions/responses logs table (#2466) | 2025-06-18 15:28:39 -07:00 |
| jest.setup.ts | feat(ui): add infinite scroll pagination to chat completions/responses logs table (#2466) | 2025-06-18 15:28:39 -07:00 |
| next.config.ts | feat: scaffolding for Llama Stack UI (#2149) | 2025-05-14 17:22:46 -07:00 |
| package-lock.json | feat(cherry-pick): fixes for 0.3.1 release (#3998) | 2025-10-30 21:51:42 -07:00 |
| package.json | build: Bump version to 0.3.0 | 2025-10-21 23:58:10 +00:00 |
| playwright.config.ts | feat(ui): add infinite scroll pagination to chat completions/responses logs table (#2466) | 2025-06-18 15:28:39 -07:00 |
| postcss.config.mjs | feat: scaffolding for Llama Stack UI (#2149) | 2025-05-14 17:22:46 -07:00 |
| README.md | feat(ui): implement chat completion views (#2201) | 2025-05-22 22:05:54 -07:00 |
| tsconfig.json | feat: scaffolding for Llama Stack UI (#2149) | 2025-05-14 17:22:46 -07:00 |

## README.md

This is WIP.

We use shadcn/ui for the UI components.

### Getting Started

First, install dependencies:

```bash
npm install
```

Then, run the development server:

```bash
npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev
```

Open http://localhost:8322 with your browser to see the result.