llama-stack-mirror/llama_stack/testing
Matthew Farrellee 01bdcce4d2
chore(recorder): update mocks to be closer to non-mock environment (#3442)
# What does this PR do?

the @required_args decorator in openai-python masks the async nature of
the {AsyncCompletions,chat.AsyncCompletions}.create methods; see
https://github.com/openai/openai-python/issues/996
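
as the linked issue describes, the decorator leaves create() looking like a
plain sync function to introspection. a quick illustration (assuming a recent
openai-python release where create() is wrapped by @required_args):

```python
import inspect

from openai.resources.chat.completions import AsyncCompletions

# even though create() must be awaited at call time, the @required_args
# wrapper hides the coroutine-function flag from inspect:
print(inspect.iscoroutinefunction(AsyncCompletions.create))  # False
```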

this means two things -

 0. we cannot use iscoroutine in the recorder to detect async vs non-async calls
 1. our mocks inappropriately introduce identifiably async behavior

for (0), we update the iscoroutine check with detection of the /v1/models
endpoint, which is the only non-async function we mock & record.
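
a minimal sketch of the endpoint-based branch (the helper names below are
illustrative, not the recorder's actual identifiers):

```python
from typing import Any, Callable

def _record(endpoint: str, response: Any) -> Any:
    # placeholder for writing the request/response pair to the recording store
    return response

def patched_method(original_method: Callable, self: Any, endpoint: str, *args, **kwargs):
    if endpoint == "/v1/models":
        # the only non-async call we mock & record, so handle it synchronously
        return _record(endpoint, original_method(self, *args, **kwargs))

    async def _async_path():
        # every other recorded endpoint is async and must be awaited
        return _record(endpoint, await original_method(self, *args, **kwargs))

    return _async_path()
```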

for (1), we could leave everything as is and rely on (0) to catch
errors. to be defensive, we instead update the unit tests to mock below the
create methods, allowing the true openai-python create() methods to be tested.
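
one way to mock below create() is to stub the HTTP transport so the genuine
openai-python create() wrapper still executes; this is a sketch of the idea,
not necessarily how the updated unit tests are written:

```python
import asyncio

import httpx
from openai import AsyncOpenAI

def handler(request: httpx.Request) -> httpx.Response:
    # canned chat.completion payload returned for any request
    return httpx.Response(
        200,
        json={
            "id": "chatcmpl-test",
            "object": "chat.completion",
            "created": 0,
            "model": "test-model",
            "choices": [
                {
                    "index": 0,
                    "message": {"role": "assistant", "content": "hello"},
                    "finish_reason": "stop",
                }
            ],
        },
    )

async def main() -> None:
    client = AsyncOpenAI(
        api_key="test-key",
        http_client=httpx.AsyncClient(transport=httpx.MockTransport(handler)),
    )
    # the real AsyncCompletions.create (including @required_args) runs here
    completion = await client.chat.completions.create(
        model="test-model",
        messages=[{"role": "user", "content": "hi"}],
    )
    print(completion.choices[0].message.content)

asyncio.run(main())
```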
2025-09-15 15:25:53 -04:00
..
__init__.py feat(tests): introduce inference record/replay to increase test reliability (#2941) 2025-07-29 12:41:31 -07:00
inference_recorder.py chore(recorder): update mocks to be closer to non-mock environment (#3442) 2025-09-15 15:25:53 -04:00