llama-stack-mirror/tests/integration/responses/conftest.py
Ashwin Bharambe 1e81056a22
feat(tests): enable MCP tests in server mode (#4146)
We would like to run all OpenAI compatibility tests using only the
openai-client library. This is friendliest for contributors since they
can run tests without needing to update the client-sdks (which is
getting easier but is still a long pole).

This is the first step in enabling that -- not using the "library client"
for any of the Responses tests. This seems like a reasonable trade-off
since using an embeddable library client for Responses (or any
OpenAI-compatible) behavior appears to be uncommon. To do this, we
needed to enable MCP tests (which only worked in library client mode)
for server mode.
2025-11-13 07:23:23 -08:00


# Copyright (c) Meta Platforms, Inc. and affiliates.
# All rights reserved.
#
# This source code is licensed under the terms described in the LICENSE file in
# the root directory of this source tree.
import pytest

from llama_stack.core.library_client import LlamaStackAsLibraryClient


@pytest.fixture
def responses_client(compat_client):
    """Provide a client for responses tests, skipping library client mode."""
    if isinstance(compat_client, LlamaStackAsLibraryClient):
        pytest.skip("Responses API tests are not supported in library client mode")
    return compat_client
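
The fixture's skip logic can be sketched standalone. This is a minimal, self-contained illustration: the dummy classes and the `SkipTest` exception are hypothetical stand-ins (in the real fixture, `pytest.skip()` raises pytest's internal skip exception); only the `LlamaStackAsLibraryClient` name and the `isinstance` check come from the file above.

```python
class LlamaStackAsLibraryClient:
    """Stand-in for llama_stack.core.library_client.LlamaStackAsLibraryClient."""


class OpenAIClient:
    """Stand-in for a plain openai-client instance (hypothetical name)."""


class SkipTest(Exception):
    """Stand-in for the exception that pytest.skip() raises."""


def responses_client(compat_client):
    # Mirrors the fixture: reject the embedded library client, pass
    # any other (server-mode) client straight through.
    if isinstance(compat_client, LlamaStackAsLibraryClient):
        raise SkipTest("Responses API tests are not supported in library client mode")
    return compat_client
```

A server-mode client passes through unchanged, while a library client triggers the skip; this is how the suite runs the same test body against both client kinds while exercising Responses only in server mode.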