Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-03 09:53:45 +00:00)
After attempting local recording generation, I ran into multiple environment issues:

1. Client/server version mismatches (0.3.x vs 0.4.0.dev0)
2. `LlamaStackClient` API changes (the `provider_data` parameter was removed)
3. Dev server network constraints (HTTP 426 errors against the OpenAI API)

Server logs from CI confirm that new recordings are needed:

- `RuntimeError: Recording not found for request hash: 56ddb450d...`
- Tests that pass an `authorization` parameter produce different OpenAI request hashes

Generating recordings locally requires a complex environment setup that matches CI, so I'm requesting reviewer assistance to generate the recordings via the CI infrastructure.
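To illustrate why the `authorization` parameter produces different request hashes, here is a minimal sketch of content-based request hashing as record/replay test harnesses commonly implement it. This is not llama-stack's actual implementation; the function name and the canonicalization scheme are assumptions for illustration only:

```python
import hashlib
import json


def request_hash(method: str, url: str, body: dict) -> str:
    """Hypothetical sketch: hash a request by serializing its canonical form.

    Any field added to the body (e.g. ``authorization``) changes the
    serialized bytes and therefore the resulting hash, so a recording
    captured without that field will not match.
    """
    canonical = json.dumps(
        {"method": method, "url": url, "body": body}, sort_keys=True
    )
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


base = {"model": "gpt-4o-mini", "input": "hello"}  # illustrative request body
h1 = request_hash("POST", "/v1/responses", base)
h2 = request_hash("POST", "/v1/responses", {**base, "authorization": "Bearer token"})
print(h1 != h2)  # the extra field yields a different hash, so no recording is found
```

Under a scheme like this, every request variant exercised by the tests needs its own recording, which is why the CI logs report `Recording not found` for the new `authorization`-bearing requests.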
Directory contents:

| Path |
|---|
| fixtures |
| recordings |
| __init__.py |
| conftest.py |
| helpers.py |
| streaming_assertions.py |
| test_basic_responses.py |
| test_conversation_responses.py |
| test_file_search.py |
| test_mcp_authentication.py |
| test_tool_responses.py |