llama-stack-mirror/llama_stack/providers/inline/agents/meta_reference
Ashwin Bharambe 3d90117891
chore(tests): fix responses and vector_io tests (#3119)
This PR includes some fixes to the MCP tests and a number of fixes for the Vector providers.

It also enables several Vector IO tests to run with
`LlamaStackLibraryClient`.

## Test Plan

Run Responses tests with llama stack library client:
```
pytest -s -v tests/integration/non_ci/responses/ --stack-config=server:starter \
  --text-model openai/gpt-4o \
  --embedding-model=sentence-transformers/all-MiniLM-L6-v2 \
  -k "client_with_models"
```

Then run the same suite with `-k openai_client`.

The rest should be taken care of by CI.
2025-08-12 16:15:53 -07:00
| File | Last commit | Date |
|---|---|---|
| `__init__.py` | chore(rename): move llama_stack.distribution to llama_stack.core (#2975) | 2025-07-30 23:30:53 -07:00 |
| `agent_instance.py` | chore: standardize session not found error (#3031) | 2025-08-04 13:12:02 -07:00 |
| `agents.py` | feat(responses): add include parameter (#3115) | 2025-08-12 10:24:01 -07:00 |
| `config.py` | feat: add list responses API (#2233) | 2025-05-23 13:16:48 -07:00 |
| `openai_responses.py` | chore(tests): fix responses and vector_io tests (#3119) | 2025-08-12 16:15:53 -07:00 |
| `persistence.py` | chore: standardize session not found error (#3031) | 2025-08-04 13:12:02 -07:00 |
| `safety.py` | chore(api): add mypy coverage to meta_reference_safety (#2661) | 2025-07-09 10:22:34 +02:00 |