Tests:

LLAMA_STACK_CONFIG=http://localhost:5002 pytest -s -v tests/integration/inference --safety-shield meta-llama/Llama-Guard-3-8B --vision-model meta-llama/Llama-4-Scout-17B-16E-Instruct --text-model meta-llama/Llama-4-Scout-17B-16E-Instruct

LLAMA_STACK_CONFIG=http://localhost:5002 pytest -s -v tests/integration/inference --safety-shield meta-llama/Llama-Guard-3-8B --vision-model Llama-4-Maverick-17B-128E-Instruct --text-model Llama-4-Maverick-17B-128E-Instruct

Co-authored-by: Eric Huang <erichuang@fb.com>
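Both invocations above assume a Llama Stack server is already running and reachable at http://localhost:5002. A minimal sketch of that workflow is shown below; the `llama stack run` step and the `ollama` distribution name are assumptions about the local setup, not part of the test commands above, so substitute whichever distribution actually serves the listed models.

```sh
# Sketch: bring up a local Llama Stack server for the integration tests.
# The distribution name ("ollama") is an assumption; use the distribution
# that serves the Llama-4 and Llama-Guard models referenced below.
llama stack run ollama --port 5002

# In a separate shell, point the test suite at the running server.
LLAMA_STACK_CONFIG=http://localhost:5002 \
  pytest -s -v tests/integration/inference \
  --safety-shield meta-llama/Llama-Guard-3-8B \
  --vision-model meta-llama/Llama-4-Scout-17B-16E-Instruct \
  --text-model meta-llama/Llama-4-Scout-17B-16E-Instruct
```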
llama3
llama3_1
llama3_2
llama3_3
llama4
resources
__init__.py
checkpoint.py
datatypes.py
hadamard_utils.py
prompt_format.py
quantize_impls.py
sku_list.py
sku_types.py
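The modules listed above appear to form the Llama model-definition layer: sku_list.py and sku_types.py for model descriptors, prompt_format.py together with resources for prompt-format documentation, and checkpoint.py, quantize_impls.py, and hadamard_utils.py for checkpoint loading and quantization. A hedged sketch of exercising that layer through the `llama` CLI follows; the exact subcommands and the model descriptor are assumptions about this version of the tooling, not something stated in this listing.

```sh
# Sketch (assumed CLI surface): list the model SKUs registered in sku_list.py.
llama model list

# Render the documented prompt format for one model; the descriptor string is
# an assumption and must match an entry shown by `llama model list`.
llama model prompt-format -m Llama3.1-8B-Instruct
```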