forked from phoenix-oss/llama-stack-mirror

commit 7c340f0236 (parent 3b54ce3499)

rename test_inference -> test_text_inference

2 changed files with 4 additions and 4 deletions
````diff
@@ -28,21 +28,21 @@ We have the following orthogonal parametrizations (pytest "marks") for inference

 If you want to run a test with the llama_8b model with fireworks, you can use:
 ```bash
-pytest -s -v llama_stack/providers/tests/inference/test_inference.py \
+pytest -s -v llama_stack/providers/tests/inference/test_text_inference.py \
   -m "fireworks and llama_8b" \
   --env FIREWORKS_API_KEY=<...>
 ```

 You can make it more complex to run both llama_8b and llama_3b on Fireworks, but only llama_3b with Ollama:
 ```bash
-pytest -s -v llama_stack/providers/tests/inference/test_inference.py \
+pytest -s -v llama_stack/providers/tests/inference/test_text_inference.py \
   -m "fireworks or (ollama and llama_3b)" \
   --env FIREWORKS_API_KEY=<...>
 ```

 Finally, you can override the model completely by doing:
 ```bash
-pytest -s -v llama_stack/providers/tests/inference/test_inference.py \
+pytest -s -v llama_stack/providers/tests/inference/test_text_inference.py \
   -m fireworks \
   --inference-model "Llama3.1-70B-Instruct" \
   --env FIREWORKS_API_KEY=<...>
````
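The marks used in the hunk above (`fireworks`, `ollama`, `llama_8b`, `llama_3b`) are ordinary pytest markers combined through fixture parametrization. The sketch below is a minimal, hypothetical illustration of that pattern, not the repository's actual fixtures; the fixture names, file name, and test body are invented for the example.

```python
# test_marks_sketch.py -- hypothetical, self-contained illustration (not part
# of the repo): provider and model marks are attached to fixture params, so
# `-m` expressions select combinations of them. In a real setup the marks
# would also be registered in conftest.py or pyproject.toml to avoid
# "unknown marker" warnings.
import pytest


@pytest.fixture(
    params=[
        pytest.param("fireworks", marks=pytest.mark.fireworks),
        pytest.param("ollama", marks=pytest.mark.ollama),
    ]
)
def inference_provider(request):
    # Hypothetical fixture name; returns the provider id for the current item.
    return request.param


@pytest.fixture(
    params=[
        pytest.param("llama_8b", marks=pytest.mark.llama_8b),
        pytest.param("llama_3b", marks=pytest.mark.llama_3b),
    ]
)
def inference_model(request):
    # Hypothetical fixture name; returns the model label for the current item.
    return request.param


def test_provider_model_combination(inference_provider, inference_model):
    # Each (provider, model) pair becomes its own test item carrying both
    # marks, which is what makes expressions like `-m "fireworks and llama_8b"`
    # or `-m "fireworks or (ollama and llama_3b)"` select the right subset.
    assert inference_provider in {"fireworks", "ollama"}
    assert inference_model in {"llama_8b", "llama_3b"}
```

Against this sketch, `pytest -m "fireworks or (ollama and llama_3b)" test_marks_sketch.py` would collect three of the four generated combinations, mirroring the selection described in the documentation change above.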
````diff
@@ -19,7 +19,7 @@ from .utils import group_chunks

 # How to run this test:
 #
-# pytest -v -s llama_stack/providers/tests/inference/test_inference.py
+# pytest -v -s llama_stack/providers/tests/inference/test_text_inference.py
 #   -m "(fireworks or ollama) and llama_3b"
 #   --env FIREWORKS_API_KEY=<your_api_key>

````
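Since the module path changed, it can help to preview which test items a mark expression selects against the renamed file before actually running it. The snippet below is an optional sketch using only standard pytest features (`pytest.main()` accepts the same arguments as the CLI, and `--collect-only`, `-q`, and `-m` are standard flags); it is not part of the commit.

```python
# preview_selection.py -- hypothetical helper: list which parametrized tests
# a mark expression would select in the renamed module, without running them.
import pytest

if __name__ == "__main__":
    pytest.main(
        [
            "--collect-only",
            "-q",
            "-m", "(fireworks or ollama) and llama_3b",
            "llama_stack/providers/tests/inference/test_text_inference.py",
        ]
    )
```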