regen markdown

Suraj Subramanian 2025-04-07 11:29:03 -07:00
parent 18c68c4950
commit f436348124


@@ -22,7 +22,7 @@ There are 4 different roles that are supported by Llama 4
- `system`: Sets the context in which to interact with the AI model. It typically includes rules, guidelines, or necessary information that helps the model respond effectively.
- `user`: Represents the human interacting with the model. It includes the inputs, commands, and questions to the model.
- `assistant`: Represents the response generated by the AI model based on the context provided in the `system`, `tool` and `user` prompts.
-- `tool`: Represents the output of a tool call when sent back to the model from the executor. (The actual token used by the model is `<|ipython|>`.)
+- `tool`: Represents the output of a tool call when sent back to the model from the executor. Note that the role name used in the prompt template is `ipython`; scroll down to the last example to see how this is used.
# Llama 4 Instruct Model
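
To make the role list above concrete, here is a minimal sketch of how the four roles might appear in a message list before being rendered into the Llama 4 prompt template. The message contents and the tool-call syntax are illustrative only; the role names come from the list above, with `ipython` standing in for `tool` as noted in this change.

```python
# Illustrative message list; only the role names are taken from the
# documentation above, everything else is made up for this sketch.
messages = [
    {"role": "system", "content": "You are a helpful assistant with access to a music-search tool."},
    {"role": "user", "content": "What are the top 2 latest trending songs?"},
    # The assistant may answer directly or emit a tool call here (syntax illustrative).
    {"role": "assistant", "content": "[trending_songs(n=2)]"},
    # The executor's output goes back under the `ipython` role (the `tool` role above).
    {"role": "ipython", "content": '{"songs": ["Song A", "Song B"]}'},
]
```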
@@ -319,7 +319,7 @@ The top 2 latest trending songs are:
##### Notes
-- Tool outputs should be passed back to the model in the `tool` role, which uses the `<|ipython|>` tag.
+- Tool outputs should be passed back to the model in the `tool` (a.k.a. `ipython`) role.
- The model parses the tool output contents until it encounters the `<|eom|>` tag. It uses this to synthesize an appropriate response to the query.
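
As a rough illustration of these notes, the sketch below wraps an executor result in the `ipython` role and terminates it with `<|eom|>`. The `<|header_start|>`/`<|header_end|>` delimiters and the `format_tool_message` helper are assumptions for this example, not APIs taken from this document.

```python
import json

def format_tool_message(result: dict) -> str:
    # Hypothetical helper: role headers assumed to use <|header_start|>/<|header_end|>.
    # The role name is `ipython`, per the note above.
    return (
        "<|header_start|>ipython<|header_end|>\n\n"
        + json.dumps(result)
        # Per the notes, the model parses the tool output until it sees <|eom|>.
        + "<|eom|>"
    )

print(format_tool_message({"songs": ["Song A", "Song B"]}))
```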