refactor: move tests/client-sdk to tests/api (#1376)

This PR moves the client-sdk tests to the api directory to better
reflect their purpose and improve code organization.
Ashwin Bharambe committed on 2025-03-03 17:28:12 -08:00 (via GitHub)
parent c3155cb1bc · commit 5736c7d682 · GPG key ID: B5690EEEBB952194
29 changed files with 13 additions and 13 deletions

@@ -80,7 +80,7 @@ LLAMA_STACK_CONFIG=
And then use this dotenv file when running client SDK tests via the following:
```bash
-$ uv run --env-file .env -- pytest -v tests/client-sdk/inference/test_text_inference.py
+$ uv run --env-file .env -- pytest -v tests/api/inference/test_text_inference.py
```
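The dotenv file consumed by `uv run --env-file` might look like the following sketch. This is illustrative only: `LLAMA_STACK_CONFIG=together` mirrors the template usage shown elsewhere in these docs, while `TOGETHER_API_KEY` is a hypothetical placeholder and not necessarily a variable these tests read.

```shell
# Hypothetical .env for the pytest invocation above.
cat > .env <<'EOF'
LLAMA_STACK_CONFIG=together
TOGETHER_API_KEY=your-key-here
EOF
```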
## Pre-commit Hooks

@@ -21,7 +21,7 @@ Here are some example PRs to help you get started:
- Create integration tests that use real provider instances and configurations
- For remote services, test actual API interactions
- Avoid mocking at the provider level since adapter layers tend to be thin
-- Reference examples in {repopath}`tests/client-sdk`
+- Reference examples in {repopath}`tests/api`
### 2. Unit Testing (Optional)
- Add unit tests for provider-specific functionality
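As a sketch of what the integration-testing guidance above means in practice, here is a hypothetical integration-style test that targets a real endpoint rather than mocking the thin adapter layer. The helper name and default URL are illustrative, taken from the endpoint example elsewhere in these docs; a real test would issue actual API calls.

```python
import os

def get_base_url(default: str = "http://localhost:8089") -> str:
    # Resolve the endpoint under test from the environment, falling back
    # to a local default; LLAMA_STACK_BASE_URL matches the docs' usage.
    return os.environ.get("LLAMA_STACK_BASE_URL", default)

def test_endpoint_is_configured():
    # A real integration test would call the live API here; this sketch
    # only checks the wiring resolves to an http(s) endpoint.
    url = get_base_url()
    assert url.startswith("http"), "expected an http(s) endpoint to test against"
```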

@@ -39,7 +39,7 @@ class TestVisionModelInference:
ImageContentItem(
image=dict(
url=URL(
-uri="https://raw.githubusercontent.com/meta-llama/llama-stack/main/tests/client-sdk/inference/dog.png"
+uri="https://raw.githubusercontent.com/meta-llama/llama-stack/main/tests/api/inference/dog.png"
)
)
),
@@ -80,7 +80,7 @@ class TestVisionModelInference:
ImageContentItem(
image=dict(
url=URL(
-uri="https://raw.githubusercontent.com/meta-llama/llama-stack/main/tests/client-sdk/inference/dog.png"
+uri="https://raw.githubusercontent.com/meta-llama/llama-stack/main/tests/api/inference/dog.png"
)
)
),

@@ -11,14 +11,14 @@ from pathlib import Path
import pytest
"""
-Script for running client-sdk on AsyncLlamaStackAsLibraryClient with templates
+Script for running api on AsyncLlamaStackAsLibraryClient with templates
Assuming directory structure:
- llama-stack
- llama_stack
- scripts
- tests
-- client-sdk
+- api
Example command:
@@ -29,7 +29,7 @@ python llama_stack/scripts/run_client_sdk_tests.py --templates together firework
"""
REPO_ROOT = Path(__file__).parent.parent.parent
-CLIENT_SDK_TESTS_RELATIVE_PATH = "tests/client-sdk/"
+CLIENT_SDK_TESTS_RELATIVE_PATH = "tests/api/"
def main(parser: argparse.ArgumentParser):
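The script above runs pytest once per template. A minimal sketch of the per-template environment setup implied by these docs follows; the helper name `build_env` is hypothetical and only mirrors the documented `LLAMA_STACK_CONFIG=<template> pytest ...` invocation.

```python
import os

def build_env(template: str) -> dict:
    # Copy the current environment and point LLAMA_STACK_CONFIG at the
    # template name, so a subprocess pytest run picks the right template.
    env = dict(os.environ)
    env["LLAMA_STACK_CONFIG"] = template
    return env
```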

@@ -3,23 +3,23 @@ You can run llama stack integration tests on either a Llama Stack Library or a L
To test on a Llama Stack library with certain configuration, run
```bash
-LLAMA_STACK_CONFIG=./llama_stack/templates/cerebras/run.yaml pytest -s -v tests/client-sdk/inference/
+LLAMA_STACK_CONFIG=./llama_stack/templates/cerebras/run.yaml pytest -s -v tests/api/inference/
```
or just the template name
```bash
-LLAMA_STACK_CONFIG=together pytest -s -v tests/client-sdk/inference/
+LLAMA_STACK_CONFIG=together pytest -s -v tests/api/inference/
```
To test on a Llama Stack endpoint, run
```bash
-LLAMA_STACK_BASE_URL=http://localhost:8089 pytest -s -v tests/client-sdk/inference
+LLAMA_STACK_BASE_URL=http://localhost:8089 pytest -s -v tests/api/inference
```
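The two modes above (library configuration vs. live endpoint) are both selected through environment variables. Here is a hypothetical sketch of that selection logic; the function name is illustrative and not taken from the repo's conftest.

```python
import os

def select_client_kind() -> str:
    # A base URL targets a running Llama Stack endpoint; otherwise a
    # config path or template name selects the library client.
    if os.environ.get("LLAMA_STACK_BASE_URL"):
        return "http"
    if os.environ.get("LLAMA_STACK_CONFIG"):
        return "library"
    raise RuntimeError("Set LLAMA_STACK_BASE_URL or LLAMA_STACK_CONFIG")
```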
## Report Generation
To generate a report, run with `--report` option
```bash
-LLAMA_STACK_CONFIG=together pytest -s -v report.md tests/client-sdk/ --report
+LLAMA_STACK_CONFIG=together pytest -s -v report.md tests/api/ --report
```
## Common options

(binary image file, 415 KiB: renamed, contents unchanged)

@@ -35,7 +35,7 @@ def test_image_chat_completion_non_streaming(client_with_models, vision_model_id
"type": "image",
"image": {
"url": {
"uri": "https://raw.githubusercontent.com/meta-llama/llama-stack/main/tests/client-sdk/inference/dog.png"
"uri": "https://raw.githubusercontent.com/meta-llama/llama-stack/main/tests/api/inference/dog.png"
},
},
},
@@ -63,7 +63,7 @@ def test_image_chat_completion_streaming(client_with_models, vision_model_id):
"type": "image",
"image": {
"url": {
"uri": "https://raw.githubusercontent.com/meta-llama/llama-stack/main/tests/client-sdk/inference/dog.png"
"uri": "https://raw.githubusercontent.com/meta-llama/llama-stack/main/tests/api/inference/dog.png"
},
},
},

(binary image file, 514 KiB: renamed, contents unchanged)

(binary image file, 176 KiB: renamed, contents unchanged)