Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-09 03:19:20 +00:00)
This keeps the prompt encoding layer in our control (see the `chat_completion_request_to_prompt()` method).
Files in this directory:

- `__init__.py`
- `augment_messages.py`
- `model_registry.py`
- `openai_compat.py`
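To illustrate what owning the prompt encoding layer means, here is a minimal, hypothetical sketch of a `chat_completion_request_to_prompt()`-style helper. The role tags and signature below are assumptions for illustration only; the real llama-stack implementation encodes messages according to each model's chat template.

```python
from typing import Dict, List


def chat_completion_request_to_prompt(messages: List[Dict[str, str]]) -> str:
    """Hypothetical sketch: flatten chat messages into one prompt string.

    Each message is rendered with a role tag, and a trailing assistant tag
    cues the model to generate its reply. Tag format is an assumption, not
    the actual llama-stack encoding.
    """
    parts = []
    for message in messages:
        parts.append(f"<|{message['role']}|>\n{message['content']}")
    # Leave an open assistant turn for the model to complete.
    parts.append("<|assistant|>\n")
    return "\n".join(parts)


prompt = chat_completion_request_to_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
```

Keeping this conversion in one place means every provider sees the same encoded prompt, rather than each backend applying its own (possibly divergent) chat formatting.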