Xi Yan 194d12b304
[bugfix] fix streaming GeneratorExit exception with LlamaStackAsLibraryClient (#760)
# What does this PR do?

#### Issue
- Using a Jupyter notebook with `LlamaStackAsLibraryClient` and streaming
raises the following exception:
```
Exception ignored in: <async_generator object HTTP11ConnectionByteStream.__aiter__ at 0x32a95a740>
Traceback (most recent call last):
  File "/opt/anaconda3/envs/fresh/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 404, in __aiter__
    yield part
RuntimeError: async generator ignored GeneratorExit
```

- Reproduce with:
https://github.com/meta-llama/llama-stack/blob/notebook-streaming-debug/inline.ipynb

#### Fix
- The issue likely comes from `stream_across_asyncio_run_boundary` closing the
connection too soon when running in a Jupyter environment
- This PR uses an alternative way to convert the `AsyncStream` into the
`SyncStream` return type of the sync `LlamaStackAsLibraryClient`, which calls
`AsyncLlamaStackAsLibraryClient` (and thus the async impls) under the hood
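For context, a minimal sketch of the general technique: driving an async iterator from synchronous code over a single long-lived event loop, so the underlying connection is not torn down mid-stream (the failure mode behind the `GeneratorExit` above). This is illustrative only; `sync_stream` and `demo` are hypothetical names, not the PR's actual code.

```python
import asyncio
from typing import AsyncIterator, Iterator, TypeVar

T = TypeVar("T")


def sync_stream(async_iter: AsyncIterator[T]) -> Iterator[T]:
    """Yield items from an async iterator in synchronous code.

    One event loop lives for the entire iteration, instead of a loop
    per chunk, so resources tied to the loop stay open until the
    stream is exhausted.
    """
    loop = asyncio.new_event_loop()
    try:
        while True:
            try:
                yield loop.run_until_complete(async_iter.__anext__())
            except StopAsyncIteration:
                break
    finally:
        loop.close()


async def demo() -> AsyncIterator[int]:
    # Stand-in for a streaming response.
    for i in range(3):
        yield i


print(list(sync_stream(demo())))  # [0, 1, 2]
```

Note that inside Jupyter an event loop is already running, so real implementations typically run the loop on a dedicated thread rather than calling `run_until_complete` directly.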

#### Additional changes
- Moved tracing logic into `AsyncLlamaStackAsLibraryClient.request` so that
streaming and non-streaming requests through `LlamaStackAsLibraryClient` share
the same code path
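The design choice can be sketched as follows: tracing is set up once in the async `request` method, and both the streaming and non-streaming branches inherit it. The class and method names below are hypothetical stand-ins, not the PR's actual implementation.

```python
import asyncio
from typing import Any, AsyncIterator


class AsyncClient:
    """Toy stand-in for AsyncLlamaStackAsLibraryClient."""

    async def request(self, *, stream: bool = False) -> Any:
        # Tracing set up once here covers both paths below,
        # so the sync wrapper needs no tracing logic of its own.
        self._start_trace()
        if stream:
            return self._stream()
        return "full response"

    def _start_trace(self) -> None:
        pass  # placeholder for real tracing setup

    async def _stream(self) -> AsyncIterator[str]:
        for chunk in ("a", "b"):
            yield chunk


async def main() -> None:
    client = AsyncClient()
    print(await client.request())                     # non-streaming
    chunks = await client.request(stream=True)        # streaming
    print([c async for c in chunks])


asyncio.run(main())
```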

## Test Plan

- Tested with together, fireworks, and ollama, both streaming and
non-streaming, using the notebook in:
https://github.com/meta-llama/llama-stack/blob/notebook-streaming-debug/inline.ipynb
- Note: restart the kernel and run `pip install -e .` in the Jupyter
interpreter for local code changes to take effect

<img width="826" alt="image"
src="https://github.com/user-attachments/assets/5f90985d-1aee-452c-a599-2157f5654fea"
/>


## Sources

Please link relevant resources if necessary.


## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.
2025-01-14 10:58:46 -08:00