Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-12-08 19:10:56 +00:00.
Latest commit: This keeps the prompt encoding layer in our control (see `chat_completion_request_to_prompt()` method)

| Name | Last commit | Last update |
|---|---|---|
| .. | | |
| bedrock | | |
| databricks | | |
| fireworks | | |
| ollama | | |
| sample | | |
| tgi | | |
| together | | |
| __init__.py | | |
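The commit message above refers to keeping prompt encoding on the stack's side rather than delegating it to each remote provider (bedrock, fireworks, ollama, etc.). A minimal sketch of that idea follows; the `Message` and `ChatCompletionRequest` dataclasses and the template tokens are illustrative assumptions, not the actual llama-stack types or implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    role: str      # e.g. "system", "user", or "assistant"
    content: str

@dataclass
class ChatCompletionRequest:
    messages: list[Message] = field(default_factory=list)

def chat_completion_request_to_prompt(request: ChatCompletionRequest) -> str:
    """Flatten chat messages into a single prompt string.

    Doing this encoding in one place means every remote provider
    receives an identical, pre-rendered prompt instead of applying
    its own (possibly divergent) chat template.
    """
    # Hypothetical role-marker tokens; the real template is model-specific.
    parts = [f"<|{m.role}|>\n{m.content}" for m in request.messages]
    parts.append("<|assistant|>\n")  # cue the model to produce a reply
    return "\n".join(parts)
```

For example, a request with a single user message would render as `<|user|>\nhi\n<|assistant|>\n`, which a provider can pass to a plain text-completion endpoint.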