This PR fixes issue #3185.

The streaming cleanup code calls `await event_gen.aclose()`, but OpenAI's `AsyncStream` does not have an `aclose()` method; it has `close()` (which is async). When a client cancels a streaming request, the server tries to clean up with:

```python
await event_gen.aclose()  # ❌ AsyncStream doesn't have aclose()!
```

`AsyncStream` has never had a public `aclose()` method, and the error message says as much:

```
AttributeError: 'AsyncStream' object has no attribute 'aclose'. Did you mean: 'close'?
                                                                              ^^^^^^^^
```

## Verification

* The reproduction script [`reproduce_issue_3185.sh`](https://gist.github.com/r-bit-rry/dea4f8fbb81c446f5db50ea7abd6379b) can be used to verify the fix.
* Manual checks and validation against the original OpenAI library source code.
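As a rough illustration of the kind of cleanup this implies (not the exact patch in this PR), the sketch below shows a hypothetical `close_stream` helper that handles both plain async generators, which do expose `aclose()`, and OpenAI's `AsyncStream`, whose `close()` is a coroutine and must be awaited:

```python
# Minimal sketch, assuming the object being cleaned up may be either a native
# async generator or an openai.AsyncStream. `close_stream` is a hypothetical
# helper name, not part of llama-stack or the OpenAI SDK.
from typing import Any


async def close_stream(event_gen: Any) -> None:
    """Best-effort cleanup when a client cancels a streaming request."""
    if hasattr(event_gen, "aclose"):
        # Native async generators (and most async iterators) expose aclose().
        await event_gen.aclose()
    elif hasattr(event_gen, "close"):
        # openai.AsyncStream exposes close() instead, and it is async,
        # so it still needs to be awaited.
        await event_gen.close()
```

Dispatching on `hasattr` rather than on concrete types keeps the cleanup path working regardless of which streaming object a provider hands back.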