llama-stack-mirror/llama_stack/providers/inline/agents
Ashwin Bharambe 4fec49dfdb
feat(responses): add include parameter (#3115)
Well, our Responses tests use it, so we had better include it in the API, no?

I discovered this while making sure `llama-stack-client` can always be
used in place of `openai-python` as the client (we do want to be
_truly_ compatible).
2025-08-12 10:24:01 -07:00
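As a minimal sketch of what this commit enables: an OpenAI-style Responses request can carry an `include` field asking the server to return extra data alongside the response. The field name follows the OpenAI Responses API; the model name and the specific include value (`file_search_call.results`) are illustrative assumptions, not taken from this commit.

```python
import json

# Hypothetical request body for a Responses API call; only the `include`
# key is the subject of this commit, the rest is placeholder context.
request_body = {
    "model": "meta-llama/Llama-3.3-70B-Instruct",
    "input": "What is in the attached file?",
    # `include` asks the server to attach extra data to the response,
    # e.g. the raw results of a file search tool call
    "include": ["file_search_call.results"],
}

# Serialize as a client would before POSTing to /v1/responses
print(json.dumps(request_body, indent=2))
```

With the parameter present in the Llama Stack API surface, the same body works whether it is sent through `llama-stack-client` or `openai-python`.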
meta_reference    feat(responses): add include parameter (#3115)    2025-08-12 10:24:01 -07:00
__init__.py       add missing inits                                 2024-11-08 17:54:24 -08:00