llama-stack-mirror/tests/integration/responses
Latest commit: d6b915ce0a "Merge remote-tracking branch 'upstream/main' into api-pkg"
Author:        Charlie Doern (Signed-off-by: Charlie Doern <cdoern@redhat.com>)
Date:          2025-11-12 13:54:09 -05:00
Name                            Last updated                Last commit
fixtures                        2025-11-12 12:17:13 -05:00  fix(tests): add OpenAI client connection cleanup to prevent CI hangs (#4119)
recordings                      2025-11-12 09:19:40 -05:00  feat: split API and provider specs into separate llama-stack-api pkg
__init__.py                     2025-09-05 13:58:49 -07:00  feat(tests): introduce a test "suite" concept to encompass dirs, options (#3339)
helpers.py                      2025-10-15 09:36:11 -07:00  feat(responses)!: improve responses + conversations implementations (#3810)
streaming_assertions.py         2025-10-10 07:27:34 -07:00  feat(responses)!: add in_progress, failed, content part events (#3765)
test_basic_responses.py         2025-10-15 16:37:42 -07:00  fix(responses): fixes, re-record tests (#3820)
test_conversation_responses.py  2025-11-12 12:17:13 -05:00  fix(tests): add OpenAI client connection cleanup to prevent CI hangs (#4119)
test_file_search.py             2025-11-07 09:54:09 -08:00  chore: Stack server no longer depends on llama-stack-client (#4094)
test_tool_responses.py          2025-11-07 09:54:09 -08:00  chore: Stack server no longer depends on llama-stack-client (#4094)
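
Two entries (fixtures and test_conversation_responses.py) were last touched by #4119, which adds OpenAI client connection cleanup so CI runs do not hang on leftover connections. As a rough illustration only, not the repository's actual fixture, a cleanup fixture of that kind might look like the sketch below; the fixture name, environment variable, and base URL are assumptions.

```python
# Illustrative sketch only: a pytest fixture that hands tests an
# OpenAI-compatible client pointed at a Llama Stack server and closes its
# pooled HTTP connections on teardown, in the spirit of the connection
# cleanup referenced by #4119. Names and the base URL are assumptions.
import os

import pytest
from openai import OpenAI


@pytest.fixture
def openai_client():
    # Assumed default endpoint; real suites derive it from their test config.
    base_url = os.environ.get("LLAMA_STACK_BASE_URL", "http://localhost:8321/v1")
    client = OpenAI(base_url=base_url, api_key="not-needed")
    try:
        yield client
    finally:
        # Explicitly release the underlying HTTP connection pool so the
        # pytest process can exit cleanly instead of hanging in CI.
        client.close()
```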