Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-16 21:49:27 +00:00)
This PR fixes issue #3185.

When clients cancel streaming requests, the server tries to clean up with:

```python
await event_gen.aclose()  # ❌ AsyncStream doesn't have aclose()!
```

but OpenAI's `AsyncStream` has never had a public `aclose()` method; it has `close()`, which is async. The error message spells it out:

```
AttributeError: 'AsyncStream' object has no attribute 'aclose'. Did you mean: 'close'?
                                                                              ^^^^^^^
```

## Verification

* The reproduction script [`reproduce_issue_3185.sh`](https://gist.github.com/r-bit-rry/dea4f8fbb81c446f5db50ea7abd6379b) can be used to verify the fix.
* Manual checks, plus validation against the original OpenAI library code.
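A minimal sketch of the shape of the fix, assuming the cleanup path may receive either a plain async generator (which has `aclose()`) or an `openai.AsyncStream` (which only has an async `close()`). The stand-in `AsyncStream` class and the `cleanup` helper below are illustrative, not the actual llama-stack or OpenAI code:

```python
import asyncio


class AsyncStream:
    """Minimal stand-in for openai.AsyncStream: async close(), no aclose()."""

    def __init__(self):
        self.closed = False

    async def close(self):
        self.closed = True


async def cleanup(event_gen):
    # Prefer aclose() (async generators), fall back to close() (AsyncStream).
    closer = getattr(event_gen, "aclose", None) or getattr(event_gen, "close", None)
    if closer is not None:
        await closer()


stream = AsyncStream()
asyncio.run(cleanup(stream))
print(stream.closed)  # True
```

Probing with `getattr` keeps one cleanup path working for both stream types instead of hard-coding a method name that only some of them have.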