llama-stack-mirror/tests/unit/providers/utils
Roy Belio c574db5f1d
fix(inference): AttributeError in streaming response cleanup (#4236)
This PR fixes issue #3185.

When clients cancel streaming requests, the server tries to clean up
with:

```python
await event_gen.aclose()  #  AsyncStream doesn't have aclose()!
```

But OpenAI's `AsyncStream` has never had a public `aclose()` method; it
exposes `close()`, which is itself async. The error message says so
directly:

```
AttributeError: 'AsyncStream' object has no attribute 'aclose'. Did you mean: 'close'?
                                                                            ^^^^^^^^
```
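
One way to avoid the error is to fall back to `close()` when `aclose()` is not available. Below is a minimal sketch of such a defensive cleanup, not necessarily the exact change in this PR; the `close_event_stream` name and the `event_gen` parameter are illustrative, and the helper assumes the stream is either a native async generator (which has `aclose()`) or an `openai.AsyncStream` (which only has an async `close()`):

```python
import inspect


async def close_event_stream(event_gen) -> None:
    """Close a streaming response regardless of which close method it exposes."""
    # Native async generators expose aclose(); openai.AsyncStream only has close().
    closer = getattr(event_gen, "aclose", None) or getattr(event_gen, "close", None)
    if closer is None:
        return
    result = closer()
    if inspect.isawaitable(result):
        # Both async-generator aclose() and AsyncStream.close() return awaitables.
        await result
```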

## Verification

* Reproduction script [`reproduce_issue_3185.sh`](https://gist.github.com/r-bit-rry/dea4f8fbb81c446f5db50ea7abd6379b) can be used to verify the fix.
* Manual checks and validation against the original OpenAI library code.
2025-12-14 07:51:09 -05:00
| File | Last commit | Date |
| --- | --- | --- |
| `inference` | fix: enforce allowed_models during inference requests (#4197) | 2025-11-19 14:49:44 -08:00 |
| `memory` | fix: rename llama_stack_api dir (#4155) | 2025-11-13 15:04:36 -08:00 |
| `__init__.py` | fix: add check for interleavedContent (#1973) | 2025-05-06 09:55:07 -07:00 |
| `test_form_data.py` | fix(expires_after): make sure multipart/form-data is properly parsed (#3612) | 2025-09-30 16:14:03 -04:00 |
| `test_model_registry.py` | fix: rename llama_stack_api dir (#4155) | 2025-11-13 15:04:36 -08:00 |
| `test_openai_compat_conversion.py` | feat(tools)!: substantial clean up of "Tool" related datatypes (#3627) | 2025-10-02 15:12:03 -07:00 |
| `test_openai_mixin_streaming.py` | fix(inference): AttributeError in streaming response cleanup (#4236) | 2025-12-14 07:51:09 -05:00 |
| `test_scheduler.py` | chore: default to pytest asyncio-mode=auto (#2730) | 2025-07-11 13:00:24 -07:00 |