Tests:

```shell
pytest -v -s -m "ollama" llama_stack/providers/tests/inference/test_text_inference.py
pytest -v -s -m vllm_remote llama_stack/providers/tests/inference/test_text_inference.py --env VLLM_URL="http://localhost:9798/v1"
```
This directory contains the inference provider adapters:

- `bedrock`
- `databricks`
- `fireworks`
- `ollama`
- `sample`
- `tgi`
- `together`
- `vllm`
- `__init__.py`