llama-stack-mirror/llama_stack
Ashwin Bharambe 05e73d12b3 introduce openai_compat with the completions (not chat-completions) API
This keeps the prompt encoding layer in our control (see the
`chat_completion_request_to_prompt()` method).
2024-10-08 17:23:42 -07:00
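The commit description above argues for targeting the plain completions endpoint so the client, not the server, decides how chat messages become a prompt. A minimal sketch of that idea follows; the function name and prompt template below are illustrative assumptions, not the actual llama_stack implementation of `chat_completion_request_to_prompt()`.

```python
# Hypothetical sketch: render chat messages into a Llama-2-style prompt
# string client-side, so the result can be sent to an OpenAI-compatible
# *completions* (not chat-completions) endpoint. The server never applies
# its own chat template, keeping prompt encoding under our control.
# Names and template are illustrative, not the real llama_stack API.

def chat_messages_to_prompt(messages: list[dict]) -> str:
    """Flatten a list of {"role", "content"} messages into one prompt string."""
    parts = []
    for msg in messages:
        role, content = msg["role"], msg["content"]
        if role == "system":
            parts.append(f"<<SYS>>\n{content}\n<</SYS>>\n")
        elif role == "user":
            parts.append(f"[INST] {content} [/INST]")
        else:  # assistant turn, echoed back verbatim
            parts.append(f" {content} ")
    return "".join(parts)

prompt = chat_messages_to_prompt([
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi"},
])
# A provider would then POST {"model": ..., "prompt": prompt} to /v1/completions.
```

Because the prompt is just a string at this point, swapping templates for a new model family is a client-side change and needs no server cooperation.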
apis          Separate chat_completion stream and non-stream implementations            2024-10-08 17:23:40 -07:00
cli           A few bug fixes for covering corner cases                                 2024-10-08 17:23:02 -07:00
distribution  Separate chat_completion stream and non-stream implementations            2024-10-08 17:23:40 -07:00
providers     introduce openai_compat with the completions (not chat-completions) API   2024-10-08 17:23:42 -07:00
scripts       Add a test for CLI, but not fully done so disabled                        2024-09-19 13:27:07 -07:00
__init__.py   API Updates (#73)                                                         2024-09-17 19:51:35 -07:00