Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-19 04:19:40 +00:00)
Latest commit:

- Handle the Ollama format, where models are nested under response['body']['models']
- Fall back to the OpenAI format, where models are directly in response['body']

Closes: #3457
Signed-off-by: Derek Higgins <derekh@redhat.com>

A sketch of this fallback appears after the file listing below.
Files:

- __init__.py
- inference_recorder.py
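The commit above normalizes two recorded response shapes for model listings. Below is a minimal, hedged sketch of that fallback under the stated assumptions; the helper name `_extract_model_list` and the example payloads are illustrative and are not taken from `inference_recorder.py`.

```python
from typing import Any


def _extract_model_list(response: dict[str, Any]) -> list[Any]:
    """Return the list of models from a recorded model-listing response.

    Ollama nests the model list under response['body']['models'], while the
    OpenAI-style format stores the models directly in response['body'].
    """
    body = response.get("body")

    # Ollama format: {"body": {"models": [...]}}
    if isinstance(body, dict) and "models" in body:
        return body["models"]

    # OpenAI format: {"body": [...]} (the body itself is the model list)
    if isinstance(body, list):
        return body

    raise ValueError(f"Unrecognized model-list response shape: {type(body)!r}")


# Example recordings in both shapes (values are illustrative only)
ollama_recording = {"body": {"models": [{"model": "llama3.2:3b"}]}}
openai_recording = {"body": [{"id": "gpt-4o-mini"}]}

assert _extract_model_list(ollama_recording) == [{"model": "llama3.2:3b"}]
assert _extract_model_list(openai_recording) == [{"id": "gpt-4o-mini"}]
```

Checking the nested Ollama shape first and only then falling back to a bare list keeps the two formats from being confused when the body happens to be a dict.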