Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-12-03 09:53:45 +00:00
Added a script to clean up recordings. While doing this, moved the CI matrix generation to a separate script so there is a single source of truth for the matrix. Ran the cleanup script as:

```
PYTHONPATH=. python scripts/cleanup_recordings.py
```

Also added this as part of the pre-commit workflow to ensure that the recordings are always up to date and that no stale recordings are left in the repo.
12 lines · 247 B · JSON
{
  "default": [
    {"suite": "base", "setup": "ollama"},
    {"suite": "vision", "setup": "ollama-vision"},
    {"suite": "responses", "setup": "gpt"}
  ],
  "schedules": {
    "1 0 * * 0": [
      {"suite": "base", "setup": "vllm"}
    ]
  }
}
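For context, here is one way a matrix-generation script could consume this file so it stays the single source of truth for both the default PR matrix and the weekly scheduled run. This is a minimal sketch; the config path, the `--schedule` flag, and the `GITHUB_OUTPUT` handling are assumptions for illustration, not the actual script in the repo.

```
#!/usr/bin/env python3
"""Sketch: emit a GitHub Actions test matrix from the JSON config above."""
import argparse
import json
import os
from pathlib import Path

# Assumed location of the config shown above; the real path may differ.
CONFIG_PATH = Path(".github/ci-test-matrix.json")


def generate_matrix(schedule: str | None = None) -> list[dict[str, str]]:
    """Return the list of {suite, setup} entries for this run.

    If a cron expression is given and it appears under "schedules",
    that matrix is used; otherwise the "default" matrix is returned.
    """
    config = json.loads(CONFIG_PATH.read_text())
    if schedule:
        return config.get("schedules", {}).get(schedule, config["default"])
    return config["default"]


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument(
        "--schedule", help="cron expression of the triggering schedule, if any"
    )
    args = parser.parse_args()

    matrix = generate_matrix(args.schedule)
    line = f"matrix={json.dumps({'include': matrix})}"

    # In a workflow step this would typically be appended to $GITHUB_OUTPUT
    # so a later job can feed it into strategy.matrix.
    output = os.environ.get("GITHUB_OUTPUT")
    if output:
        with open(output, "a") as f:
            f.write(line + "\n")
    else:
        print(line)
```

A workflow job could then read this output into `strategy.matrix`, so pull-request runs get the default ollama, ollama-vision, and gpt suites while the `1 0 * * 0` schedule runs the base suite against vllm, all driven by the same JSON file.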