Update README.md
parent df864ee575
commit ab210ec59e
1 changed file with 3 additions and 6 deletions
@@ -3,19 +3,16 @@ You can run llama stack integration tests on either a Llama Stack Library or a L
 To test on a Llama Stack library with certain configuration, run
 ```bash
-LLAMA_STACK_CONFIG=./llama_stack/templates/cerebras/run.yaml
-pytest -s -v tests/client-sdk/inference/
+LLAMA_STACK_CONFIG=./llama_stack/templates/cerebras/run.yaml pytest -s -v tests/client-sdk/inference/
 ```
 or just the template name
 ```bash
-LLAMA_STACK_CONFIG=together
-pytest -s -v tests/client-sdk/inference/
+LLAMA_STACK_CONFIG=together pytest -s -v tests/client-sdk/inference/
 ```
 
 To test on a Llama Stack endpoint, run
 ```bash
-LLAMA_STACK_BASE_URL=http//localhost:8089
-pytest -s -v tests/client-sdk/inference
+LLAMA_STACK_BASE_URL=http://localhost:8089 pytest -s -v tests/client-sdk/inference
 ```
 
 ## Report Generation
 
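Context for the change (an inference, not stated in the commit itself): in bash, a bare `VAR=value` assignment on its own line creates an unexported shell variable, so a `pytest` run on the following line would not see `LLAMA_STACK_CONFIG`. Prefixing the command, as the updated README lines do, scopes the variable to that single invocation. A minimal sketch of the two equivalent ways to pass the variable:

```bash
# Inline prefix: LLAMA_STACK_CONFIG is set only for this one pytest process.
LLAMA_STACK_CONFIG=together pytest -s -v tests/client-sdk/inference/

# Equivalent export form: the variable persists for the rest of the shell session.
export LLAMA_STACK_CONFIG=together
pytest -s -v tests/client-sdk/inference/
```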