Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-17 22:47:14 +00:00)
- Add content part events (`response.content_part.added`/`done`) for granular text streaming
- Implement MCP-specific argument streaming (`response.mcp_call.arguments.delta`/`done`)
- Differentiate between MCP and function call streaming events
- Update unit and integration tests for new streaming events
- Ensure proper event ordering and OpenAI spec compliance

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
| Name |
|---|
| inline |
| registry |
| remote |
| utils |
| __init__.py |
| datatypes.py |
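The event ordering described in the commit message above can be sketched as follows. The event type names (`response.content_part.added`/`done`, `response.mcp_call.arguments.delta`/`done`) come from the commit itself; the `StreamEvent` class, the two emitter helpers, and the payload shapes are hypothetical illustrations, not the repository's actual implementation.

```python
from dataclasses import dataclass, field
from typing import Any


@dataclass
class StreamEvent:
    """Hypothetical stand-in for a server-sent streaming event."""
    type: str
    data: dict[str, Any] = field(default_factory=dict)


def emit_text_stream(chunks: list[str]) -> list[StreamEvent]:
    """Wrap text chunks in content-part events: added, then deltas, then done."""
    events = [StreamEvent("response.content_part.added")]
    text = ""
    for chunk in chunks:
        text += chunk
        events.append(StreamEvent("response.output_text.delta", {"delta": chunk}))
    # The done event carries the fully accumulated text.
    events.append(StreamEvent("response.content_part.done", {"text": text}))
    return events


def emit_mcp_arguments(arg_chunks: list[str]) -> list[StreamEvent]:
    """MCP tool-call arguments stream as their own delta/done pair,
    distinct from function-call streaming events."""
    events = [
        StreamEvent("response.mcp_call.arguments.delta", {"delta": c})
        for c in arg_chunks
    ]
    events.append(
        StreamEvent(
            "response.mcp_call.arguments.done",
            {"arguments": "".join(arg_chunks)},
        )
    )
    return events


if __name__ == "__main__":
    for ev in emit_text_stream(["Hel", "lo"]) + emit_mcp_arguments(['{"q":', '"x"}']):
        print(ev.type)
```

The sketch shows the invariant the commit's tests presumably check: every `added` event is eventually closed by a matching `done` event whose payload holds the accumulated content.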