llama-stack-mirror/llama_stack/providers/remote/inference/llama_openai_compat
| File | Last commit | Date |
|------|-------------|------|
| `__init__.py` | feat: introduce APIs for retrieving chat completion requests (#2145) | 2025-05-18 21:43:19 -07:00 |
| `config.py` | chore: use RemoteInferenceProviderConfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| `llama.py` | chore: inference=remote::llama-openai-compat does not support /v1/completion | 2025-10-04 12:50:22 -04:00 |
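Taken together, the listing shows the usual three-file layout of a remote inference provider in this directory: `__init__.py` wires up the provider, `config.py` holds its configuration (per the 2025-10-03 commit, built on `RemoteInferenceProviderConfig`), and `llama.py` implements the adapter, which per the 2025-10-04 commit does not support the legacy `/v1/completion` endpoint. Below is a minimal, hypothetical sketch of such an adapter; the class names, field names, and default base URL are assumptions for illustration, not code taken from the repository.

```python
# Illustrative sketch only: LlamaCompatConfig, LlamaCompatAdapter, and the
# default base URL below are assumptions, not code from config.py or llama.py.
import httpx
from pydantic import BaseModel, Field


class LlamaCompatConfig(BaseModel):
    # Stand-in for the provider config in config.py; per the 2025-10-03 commit,
    # the real config builds on RemoteInferenceProviderConfig.
    openai_compat_api_base: str = Field(default="https://api.llama.com/compat/v1")
    api_key: str | None = None


class LlamaCompatAdapter:
    """Sketch of an adapter that fronts an OpenAI-compatible remote service."""

    def __init__(self, config: LlamaCompatConfig) -> None:
        self.config = config

    def _headers(self) -> dict[str, str]:
        # Attach a bearer token only when an API key is configured.
        if self.config.api_key:
            return {"Authorization": f"Bearer {self.config.api_key}"}
        return {}

    async def chat_completion(self, model: str, messages: list[dict]) -> dict:
        # Chat completions are forwarded to the OpenAI-compatible
        # /chat/completions route of the remote service.
        url = self.config.openai_compat_api_base.rstrip("/") + "/chat/completions"
        async with httpx.AsyncClient() as client:
            resp = await client.post(
                url,
                json={"model": model, "messages": messages},
                headers=self._headers(),
            )
            resp.raise_for_status()
            return resp.json()

    async def completion(self, model: str, prompt: str) -> dict:
        # Mirrors the 2025-10-04 commit message: this provider does not
        # support the legacy /v1/completion endpoint.
        raise NotImplementedError(
            "remote::llama-openai-compat does not support /v1/completion"
        )
```

In a llama-stack configuration, a provider of this kind is what the `llama.py` commit message refers to as `inference=remote::llama-openai-compat`; `__init__.py` would typically expose the factory that instantiates the adapter from its validated config.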