Ashwin Bharambe 981fc3c93c
fix(test): no need to specify tool prompt format explicitly in tests (#1295)
# What does this PR do?

Removes the complex tool-prompt-format machinery from the tests; the format no longer needs to be specified explicitly.


## Test Plan

```bash
LLAMA_STACK_CONFIG=ollama pytest -s -v tests/client-sdk/inference/test_text_inference.py --inference-model=meta-llama/Llama-3.2-3B-Instruct --vision-inference-model=""
```

2025-02-27 10:09:57 -08:00