llama-stack-mirror/llama_stack/providers/tests
Ashwin Bharambe 05e73d12b3 2024-10-08 17:23:42 -07:00
introduce openai_compat with the completions (not chat-completions) API
This keeps the prompt encoding layer under our control (see the `chat_completion_request_to_prompt()` method).
inference      introduce openai_compat with the completions (not chat-completions) API   2024-10-08 17:23:42 -07:00
memory         weaviate fixes, test now passes                                           2024-10-08 17:23:02 -07:00
__init__.py    Add inference test                                                        2024-10-08 17:23:02 -07:00
resolver.py    Add really basic testing for memory API                                   2024-10-08 17:23:02 -07:00
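The commit above moves the inference adapters onto the OpenAI-compatible completions endpoint rather than chat-completions, so that chat messages are turned into a prompt string on our side (the job of `chat_completion_request_to_prompt()`) instead of by the remote server's own chat template. The sketch below illustrates that idea only; it is not the repository's actual code, and the helper name `encode_chat_as_prompt`, the Llama-3-style header tokens, the model id, and the local server URL are assumptions made for illustration.

```python
# A minimal sketch (assumed names, not llama-stack's real code) of why one
# would call the plain completions API: the chat-to-prompt encoding happens
# locally, so the model-specific template stays under the caller's control
# instead of being applied by the remote server's chat-completions handler.

from dataclasses import dataclass
from typing import List

import requests  # any HTTP client would do


@dataclass
class Message:
    role: str      # "system", "user", or "assistant"
    content: str


def encode_chat_as_prompt(messages: List[Message]) -> str:
    """Flatten chat messages into one prompt string.

    The Llama-3-style header tokens below are purely illustrative; in the
    repository this job belongs to chat_completion_request_to_prompt(),
    which applies the actual model template.
    """
    parts = ["<|begin_of_text|>"]
    for m in messages:
        parts.append(
            f"<|start_header_id|>{m.role}<|end_header_id|>\n\n{m.content}<|eot_id|>"
        )
    # Leave an open assistant header so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)


def complete(base_url: str, model: str, messages: List[Message]) -> str:
    """Send the locally encoded prompt to an OpenAI-compatible completions endpoint."""
    resp = requests.post(
        f"{base_url}/v1/completions",  # note: not /v1/chat/completions
        json={
            "model": model,
            "prompt": encode_chat_as_prompt(messages),
            "max_tokens": 128,
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["text"]


if __name__ == "__main__":
    # Assumes an OpenAI-compatible server is running locally; both the URL and
    # the model id are placeholders.
    print(complete(
        "http://localhost:8000",
        "meta-llama/Llama-3.1-8B-Instruct",
        [Message("system", "You are a helpful assistant."),
         Message("user", "Say hello in one word.")],
    ))
```

The design point is that the same template logic is exercised no matter which OpenAI-compatible server sits behind `base_url`; with chat-completions, the server would apply its own template instead.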