llama-stack-mirror/docs/static
Ashwin Bharambe d47f2c0ba8 feat(responses)!: improve responses + conversations implementations
This PR updates the Conversation item-related types and improves a
couple of critical parts of the implementation:

- It creates a streaming output item for the final assistant message produced
  by the model (see the sketch below). Until now, we only added content parts
  and included that message in the final response.

- It rewrites the conversation update code completely to account for items
  other than messages (tool calls, outputs, etc.).
2025-10-14 14:42:12 -07:00
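
A minimal sketch of the streamed-output-item behavior described in the first bullet, assuming OpenAI-style Responses streaming events (`response.output_item.added` / `response.output_item.done`). The function name and the plain-dict event shapes are illustrative stand-ins, not the actual llama-stack implementation:

```python
# Illustrative sketch only: wrap the completed assistant message in
# output_item added/done streaming events, in addition to the content-part
# events that were already streamed. Event names follow OpenAI Responses
# streaming conventions; dicts stand in for the typed event models.
from collections.abc import AsyncIterator
from typing import Any


async def stream_final_message(
    message_item: dict[str, Any],
    output_index: int,
    sequence_number: int,
) -> AsyncIterator[dict[str, Any]]:
    # Announce the assistant message as a new output item.
    yield {
        "type": "response.output_item.added",
        "output_index": output_index,
        "sequence_number": sequence_number,
        "item": {**message_item, "status": "in_progress"},
    }
    # ... content part / text delta events are emitted while the model runs ...
    # Close the item so clients see the final assistant message as a streamed
    # output item, not only inside the final response object.
    yield {
        "type": "response.output_item.done",
        "output_index": output_index,
        "sequence_number": sequence_number + 1,
        "item": {**message_item, "status": "completed"},
    }
```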
img                                   docs: update OG image (#3669)                                          2025-10-03 10:22:54 -07:00
providers/vector_io                   docs: static content migration (#3535)                                 2025-09-24 14:08:50 -07:00
deprecated-llama-stack-spec.html      feat(responses)!: improve responses + conversations implementations   2025-10-14 14:42:12 -07:00
deprecated-llama-stack-spec.yaml      feat(responses)!: improve responses + conversations implementations   2025-10-14 14:42:12 -07:00
experimental-llama-stack-spec.html    chore(api)!: BREAKING CHANGE: remove ALL telemetry APIs (#3740)        2025-10-14 13:48:40 -07:00
experimental-llama-stack-spec.yaml    chore(api)!: BREAKING CHANGE: remove ALL telemetry APIs (#3740)        2025-10-14 13:48:40 -07:00
llama-stack-spec.html                 feat(responses)!: improve responses + conversations implementations   2025-10-14 14:42:12 -07:00
llama-stack-spec.yaml                 feat(responses)!: improve responses + conversations implementations   2025-10-14 14:42:12 -07:00
remote_or_local.gif                   docs: static content migration (#3535)                                 2025-09-24 14:08:50 -07:00
safety_system.webp                    docs: static content migration (#3535)                                 2025-09-24 14:08:50 -07:00
site.webmanifest                      docs: add favicon and mobile styling (#3650)                           2025-10-02 10:42:54 +02:00
stainless-llama-stack-spec.html       feat(responses)!: improve responses + conversations implementations   2025-10-14 14:42:12 -07:00
stainless-llama-stack-spec.yaml       feat(responses)!: improve responses + conversations implementations   2025-10-14 14:42:12 -07:00