llama-stack-mirror/llama_stack/providers
Commit `8638537d14` by Ashwin Bharambe
feat(responses): stream progress of tool calls (#3135)
# What does this PR do?
Enhances tool execution by emitting real-time progress events during tool calls. Streaming events are added for MCP and web search tools, covering the in-progress, searching, completed, and failed states.

The refactored `_execute_tool_call` method now returns an async iterator that yields streaming events throughout the tool execution lifecycle.
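A minimal sketch of this pattern, using placeholder event/result dataclasses and a hypothetical `_run_tool` helper rather than the actual llama-stack types:

```python
import asyncio
from collections.abc import AsyncIterator
from dataclasses import dataclass


@dataclass
class ToolCallProgressEvent:
    """Emitted while a tool call is running (e.g. in-progress, searching)."""
    state: str            # "in_progress" | "searching" | "completed" | "failed"
    item_id: str
    output_index: int
    sequence_number: int


@dataclass
class ToolCallResult:
    """Final payload yielded once the tool call finishes."""
    item_id: str
    output: str | None
    error: str | None = None


async def _execute_tool_call(
    tool_name: str, arguments: dict, item_id: str, output_index: int
) -> AsyncIterator[ToolCallProgressEvent | ToolCallResult]:
    """Yield progress events throughout execution, then the final result."""
    seq = 0
    yield ToolCallProgressEvent("in_progress", item_id, output_index, seq)
    try:
        if tool_name == "web_search":
            seq += 1
            yield ToolCallProgressEvent("searching", item_id, output_index, seq)
        output = await _run_tool(tool_name, arguments)  # hypothetical helper
        seq += 1
        yield ToolCallProgressEvent("completed", item_id, output_index, seq)
        yield ToolCallResult(item_id=item_id, output=output)
    except Exception as exc:
        seq += 1
        yield ToolCallProgressEvent("failed", item_id, output_index, seq)
        yield ToolCallResult(item_id=item_id, output=None, error=str(exc))


async def _run_tool(tool_name: str, arguments: dict) -> str:
    # Stand-in for the real MCP / web-search invocation.
    await asyncio.sleep(0)
    return "tool output"
```

Because the method is now an async iterator, the caller can forward each progress event to the response stream as it arrives instead of waiting for the tool call to finish.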

## Test Plan
Updated the integration test `test_response_streaming_multi_turn_tool_execution` to verify the presence and structure of the new streaming events (see the sketch after this list), including:
- Checking for MCP in-progress and completed events
- Verifying that progress events contain required fields (item_id, output_index, sequence_number)
- Ensuring completed events have the necessary sequence_number field
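A rough sketch of those checks, assuming `chunks` holds the collected streaming events from a streamed response and using illustrative event-type strings rather than the exact ones in the test:

```python
def check_mcp_progress_events(chunks) -> None:
    # Partition the stream into in-progress and completed MCP call events.
    progress = [c for c in chunks if c.type == "response.mcp_call.in_progress"]
    completed = [c for c in chunks if c.type == "response.mcp_call.completed"]

    assert len(progress) > 0, "expected at least one MCP in-progress event"
    assert len(completed) > 0, "expected at least one MCP completed event"

    # Progress events must carry the required fields.
    for event in progress:
        assert event.item_id
        assert event.output_index is not None
        assert event.sequence_number is not None

    # Completed events need a sequence_number.
    for event in completed:
        assert event.sequence_number is not None
```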
Committed 2025-08-13 16:31:25 -07:00
| Name | Last commit | Last updated |
|------|-------------|--------------|
| inline | feat(responses): stream progress of tool calls (#3135) | 2025-08-13 16:31:25 -07:00 |
| registry | chore(tests): fix responses and vector_io tests (#3119) | 2025-08-12 16:15:53 -07:00 |
| remote | fix: Dell distribution missing kvstore (#3113) | 2025-08-13 06:18:25 -07:00 |
| utils | chore(tests): fix responses and vector_io tests (#3119) | 2025-08-12 16:15:53 -07:00 |
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| datatypes.py | feat: create unregister shield API endpoint in Llama Stack (#2853) | 2025-08-05 07:33:46 -07:00 |