llama-stack-mirror/llama_stack/providers/remote/inference/ollama
Matthew Farrellee 65c4ffca28 feat(internal): add image_url download feature to OpenAIMixin
simplify the Ollama inference adapter by -
 - moving the image_url download code into OpenAIMixin
 - inheriting from ModelRegistryHelper instead of composing one (mypy rejects assigning the check_model_availability method to an instance)
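The second bullet can be sketched as follows. This is a minimal, hypothetical illustration of why the adapter inherits from ModelRegistryHelper rather than holding one: assigning a bound method onto `self` is rejected by mypy, while inheritance makes the method available directly. The class and method names below mirror those in the commit message, but the bodies are illustrative stubs, not the real llama-stack implementation.

```python
class ModelRegistryHelper:
    """Stubbed-down stand-in for the real helper class."""

    def __init__(self, model_aliases: dict[str, str]) -> None:
        self.model_aliases = model_aliases

    def check_model_availability(self, model: str) -> bool:
        return model in self.model_aliases


class OllamaAdapterComposed:
    """Before: composition. Forwarding works, but the shortcut of
    assigning the helper's bound method onto the instance
    (`self.check_model_availability = self._registry.check_model_availability`)
    is flagged by mypy as an illegal method assignment."""

    def __init__(self) -> None:
        self._registry = ModelRegistryHelper({"llama3.2": "llama3.2:3b"})

    def check_model_availability(self, model: str) -> bool:
        return self._registry.check_model_availability(model)


class OllamaAdapterInherited(ModelRegistryHelper):
    """After: inheritance. The adapter *is* a ModelRegistryHelper,
    so check_model_availability needs no assignment or forwarding."""

    def __init__(self) -> None:
        super().__init__({"llama3.2": "llama3.2:3b"})


adapter = OllamaAdapterInherited()
print(adapter.check_model_availability("llama3.2"))  # → True
print(adapter.check_model_availability("unknown-model"))  # → False
```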

testing -
 - add unit tests for the new download feature
 - add integration tests for openai_chat_completion with image_url (closes a test gap)
2025-09-22 06:56:56 -04:00
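The image_url download feature named above typically means converting a remote image reference into an inline base64 data URL before handing the message to the backend. A minimal sketch of that conversion step, with the HTTP fetch stubbed out so it stays self-contained; the function name and signature are illustrative, not the actual OpenAIMixin API:

```python
import base64


def image_bytes_to_data_url(content: bytes, mime_type: str = "image/png") -> str:
    """Encode already-downloaded image bytes as an RFC 2397 data URL.

    In the real mixin the bytes would first be fetched over HTTP from the
    message's image_url; here the download is stubbed out and the caller
    supplies the bytes directly.
    """
    encoded = base64.b64encode(content).decode("ascii")
    return f"data:{mime_type};base64,{encoded}"


# PNG magic-number prefix used as stand-in image content.
print(image_bytes_to_data_url(b"\x89PNG"))  # → data:image/png;base64,iVBORw==
```

Inlining the image this way lets a backend that cannot fetch remote URLs (such as a local Ollama server) still receive the image content inside the chat request.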
__init__.py fix: Ollama should be optional in starter distro (#2482) 2025-06-25 15:54:00 +02:00
config.py feat(registry): make the Stack query providers for model listing (#2862) 2025-07-24 10:39:53 -07:00
models.py fix: Safety in starter (#2731) 2025-07-14 15:07:40 -07:00
ollama.py feat(internal): add image_url download feature to OpenAIMixin 2025-09-22 06:56:56 -04:00