forked from phoenix-oss/llama-stack-mirror
Tests:

- `pytest -v -s -m "ollama" llama_stack/providers/tests/inference/test_text_inference.py`
- `pytest -v -s -m vllm_remote llama_stack/providers/tests/inference/test_text_inference.py --env VLLM_URL="http://localhost:9798/v1"`
Directory listing:

- `bedrock`
- `datasetio`
- `inference`
- `kvstore`
- `memory`
- `scoring`
- `telemetry`
- `__init__.py`