Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-27 21:31:59 +00:00)
This adjusts how previous responses are restored: they are now prepended to the list of Responses API inputs rather than to our converted list of Chat Completion messages. This matches the expected behavior of the Responses API; I misinterpreted the nuances here in the initial implementation.

Signed-off-by: Ben Browning <bbrownin@redhat.com>
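A rough sketch of the adjusted flow is below. The names used here (`StoredResponse`, `build_chat_messages`, `convert_input_to_chat_messages`) are hypothetical illustrations, not the actual identifiers in `openai_responses.py`; the point is only that the restored previous response is prepended to the Responses API input items *before* conversion, instead of being prepended as already-converted Chat Completion messages.

```python
from dataclasses import dataclass, field
from typing import Any, Callable

# Hypothetical stand-in for a stored previous response; the real
# llama-stack record type and field names may differ.
@dataclass
class StoredResponse:
    input: list[Any] = field(default_factory=list)
    output: list[Any] = field(default_factory=list)

def build_chat_messages(
    new_input: list[Any],
    convert_input_to_chat_messages: Callable[[list[Any]], list[dict]],
    previous_response: StoredResponse | None = None,
) -> list[dict]:
    """Prepend a restored previous response to the Responses API inputs,
    then convert the combined item list to Chat Completion messages."""
    combined: list[Any] = list(new_input)
    if previous_response is not None:
        # The fix: prepend the prior turn's raw Responses API items here,
        # before conversion, rather than prepending already-converted
        # Chat Completion messages afterward.
        combined = (
            list(previous_response.input)
            + list(previous_response.output)
            + combined
        )
    return convert_input_to_chat_messages(combined)
```

Keeping everything in Responses API item form until a single conversion step means the restored conversation is handled exactly like a freshly supplied multi-item input.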
| File |
|---|
| __init__.py |
| agent_instance.py |
| agents.py |
| config.py |
| openai_responses.py |
| persistence.py |
| safety.py |