llama-stack/tests/client-sdk/inference
Ashwin Bharambe 981fc3c93c
fix(test): no need to specify tool prompt format explicitly in tests (#1295)
# What does this PR do?

There is no need for complex tool prompt format machinery in the tests; the format does not have to be specified explicitly.
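
For illustration, a minimal sketch of the simplified test shape, assuming the `llama_stack_client` SDK; the base URL, model ID, and `get_weather` tool here are placeholders for this example, not values taken from the actual test file:

```python
# Sketch of a tool-calling test that omits tool_prompt_format.
# Assumptions: a llama-stack server at localhost:8321 and a hypothetical
# get_weather tool; neither is prescribed by the test suite itself.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")


def test_tool_calling_uses_default_prompt_format():
    response = client.inference.chat_completion(
        model_id="meta-llama/Llama-3.2-3B-Instruct",
        messages=[{"role": "user", "content": "What is the weather in Tokyo?"}],
        tools=[
            {
                "tool_name": "get_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "city": {
                        "param_type": "string",
                        "description": "Name of the city",
                        "required": True,
                    }
                },
            }
        ],
        # Note: no tool_prompt_format argument -- the stack picks a
        # suitable default for the model.
    )
    assert response.completion_message.tool_calls
```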


## Test Plan

```bash
LLAMA_STACK_CONFIG=ollama pytest -s -v tests/client-sdk/inference/test_text_inference.py --inference-model=meta-llama/Llama-3.2-3B-Instruct --vision-inference-model=""
```
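
This runs the text inference tests against the `ollama` distribution with `meta-llama/Llama-3.2-3B-Instruct`; `--vision-inference-model` is passed as an empty string since only the text tests are exercised here.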

2025-02-27 10:09:57 -08:00
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | [tests] add client-sdk pytests & delete client.py (#638) | 2024-12-16 12:04:56 -08:00 |
| dog.png | fix vllm base64 image inference (#815) | 2025-01-17 17:07:28 -08:00 |
| test_embedding.py | fix: remove list of list tests, no longer relevant after #1161 (#1205) | 2025-02-21 08:07:35 -08:00 |
| test_text_inference.py | fix(test): no need to specify tool prompt format explicitly in tests (#1295) | 2025-02-27 10:09:57 -08:00 |
| test_vision_inference.py | feat(providers): support non-llama models for inference providers (#1200) | 2025-02-21 13:21:28 -08:00 |