llama-stack-mirror/tests/integration/fixtures
Ashwin Bharambe 30ba8c8655
fix(responses): sync conversation before yielding terminal events in streaming (#3888)
Move conversation sync logic before the yield to ensure it executes even when streaming consumers break early after receiving the `response.completed` event.
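
For illustration, here is a minimal sketch of the pattern (the names `stream_response`, `sync_conversation`, and `Event` are hypothetical, not the actual llama-stack code): an async generator persists conversation state *before* yielding the terminal event, so the sync still runs even if the consumer stops iterating the moment it sees `response.completed`.

```python
import asyncio
from dataclasses import dataclass


@dataclass
class Event:
    type: str
    data: dict


async def stream_response(events, sync_conversation):
    """Yield streaming events, syncing conversation state before the terminal event.

    If the sync ran after the final yield, a consumer that breaks out of its
    `async for` loop on `response.completed` would close the generator and the
    sync would never execute.
    """
    for event in events:
        if event.type == "response.completed":
            # Persist conversation items first; the consumer may stop
            # iterating as soon as it receives this terminal event.
            await sync_conversation(event.data)
        yield event


async def main():
    async def sync_conversation(data):
        print("synced:", data)

    events = [
        Event("response.output_text.delta", {"text": "hi"}),
        Event("response.completed", {"id": "resp_123"}),
    ]

    async for event in stream_response(events, sync_conversation):
        if event.type == "response.completed":
            break  # consumer exits early; the sync has already run


asyncio.run(main())
```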

## Test Plan

```
OLLAMA_URL=http://localhost:11434 \
  pytest -sv tests/integration/responses/ \
  --stack-config server:ci-tests \
  --text-model ollama/llama3.2:3b-instruct-fp16 \
  --inference-mode live \
  -k conversation_multi
```

This test now passes.
2025-10-22 14:31:12 -07:00
| File | Last commit | Date |
|------|-------------|------|
| `__init__.py` | refactor(test): introduce --stack-config and simplify options (#1404) | 2025-03-05 17:02:02 -08:00 |
| `common.py` | fix(responses): sync conversation before yielding terminal events in streaming (#3888) | 2025-10-22 14:31:12 -07:00 |