Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-27 21:02:00 +00:00)
The bulk of the change here makes the conversions between Responses API inputs -> Chat Completion API messages and Chat Completion API choices -> Responses API outputs clearer, through code comments, method renaming, and slight refactoring. There are also some other minor changes, such as moving a pydantic model from api/ into the implementation (since it is not actually exposed via the API) and making some if/else usage clearer.

Signed-off-by: Ben Browning <bbrownin@redhat.com>
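For context, here is a minimal sketch of what a conversion layer like this can look like. The model names and helper names below are illustrative assumptions for the sketch, not the actual code in openai_responses.py.

```python
# Illustrative sketch only: these models and helpers are assumptions,
# not the actual llama-stack implementation.
from pydantic import BaseModel


class OpenAIResponseInputMessage(BaseModel):
    """A single input item from a Responses API request (assumed shape)."""
    role: str
    content: str


class ChatCompletionMessage(BaseModel):
    """A message in Chat Completion API format (assumed shape)."""
    role: str
    content: str


def _convert_response_input_to_chat_messages(
    inputs: str | list[OpenAIResponseInputMessage],
) -> list[ChatCompletionMessage]:
    """Convert Responses API input (a bare string or a list of input
    messages) into the Chat Completion messages the inference call expects."""
    if isinstance(inputs, str):
        # A bare string is treated as a single user message.
        return [ChatCompletionMessage(role="user", content=inputs)]
    return [
        ChatCompletionMessage(role=item.role, content=item.content)
        for item in inputs
    ]


def _convert_chat_choice_to_response_output(choice: ChatCompletionMessage) -> dict:
    """Convert a Chat Completion choice message back into a Responses API
    output item (a plain dict here for brevity)."""
    return {
        "type": "message",
        "role": choice.role,
        "content": [{"type": "output_text", "text": choice.content}],
    }
```

Keeping each direction of the translation in its own small, descriptively named helper is what makes the flow readable, which is the kind of clarity the renaming and refactoring in this change aim for.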
| File |
|---|
| __init__.py |
| agents.py |
| openai_responses.py |