forked from phoenix-oss/llama-stack-mirror
tests:

    pytest -v -s -m "ollama" llama_stack/providers/tests/inference/test_text_inference.py
    pytest -v -s -m vllm_remote llama_stack/providers/tests/inference/test_text_inference.py --env VLLM_URL="http://localhost:9798/v1"
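The second command passes a runtime setting via a repeated `--env KEY=VALUE` flag. As a minimal sketch of how such a flag can be wired into a pytest `conftest.py` (the helper name `parse_env_pairs` is illustrative, not taken from llama-stack):

```python
# Hypothetical conftest.py sketch: accept repeated --env KEY=VALUE flags
# and export each pair into os.environ before tests run.
import os
from typing import Dict, List


def parse_env_pairs(pairs: List[str]) -> Dict[str, str]:
    """Split KEY=VALUE strings; '=' inside the value is preserved."""
    env = {}
    for pair in pairs:
        key, sep, value = pair.partition("=")
        if not sep:
            raise ValueError(f"--env expects KEY=VALUE, got {pair!r}")
        env[key] = value
    return env


def pytest_addoption(parser):
    # action="append" lets the flag be given multiple times:
    #   pytest ... --env VLLM_URL=http://localhost:9798/v1 --env FOO=bar
    parser.addoption(
        "--env",
        action="append",
        default=[],
        help="KEY=VALUE pair to set in os.environ for the test session",
    )


def pytest_configure(config):
    for key, value in parse_env_pairs(config.getoption("--env")).items():
        os.environ[key] = value
```

With this in place, a fixture or test can read `os.environ["VLLM_URL"]` to locate the remote vLLM server, so the URL never needs to be hard-coded in the test files.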
Directory contents of `llama_stack/providers/tests/inference/`:

    __init__.py
    conftest.py
    fixtures.py
    pasta.jpeg
    test_model_registration.py
    test_prompt_adapter.py
    test_text_inference.py
    test_vision_inference.py
    utils.py