feat: allow ollama to use 'latest' if available but not specified

ollama's CLI supports running models via commands such as 'ollama run llama3.2'.
this syntax does not work with the INFERENCE_MODEL llamastack var, which
currently requires an explicit tag such as 'latest'

this commit checks whether a 'latest' tag of the requested model is available
in ollama and falls back to it when a user passes a model name without a tag

Signed-off-by: Nathan Weinberg <nweinber@redhat.com>
Nathan Weinberg 2025-04-08 15:56:19 -04:00
parent 2fcb70b789
commit 0e5574cf9d

@@ -313,6 +313,12 @@ class OllamaInferenceAdapter(Inference, ModelsProtocolPrivate):
         response = await self.client.list()
         available_models = [m["model"] for m in response["models"]]
         if model.provider_resource_id not in available_models:
+            available_models_latest = [m["model"].split(":latest")[0] for m in response["models"]]
+            if model.provider_resource_id in available_models_latest:
+                logger.warning(
+                    f"Imprecise provider resource id was used but 'latest' is available in Ollama - using '{model.provider_resource_id}:latest'"
+                )
+                return model
             raise ValueError(
                 f"Model '{model.provider_resource_id}' is not available in Ollama. Available models: {', '.join(available_models)}"
             )
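
For context, here is a minimal standalone sketch of the tag-fallback check the hunk adds, decoupled from the adapter class. The `resolve_model` function and the `models` sample data are hypothetical stand-ins for the adapter method and the Ollama client's model listing; unlike the hunk above, which returns the model object unchanged, this sketch returns the resolved ':latest' id to make the fallback visible.

    import logging

    logger = logging.getLogger(__name__)

    def resolve_model(requested: str, ollama_models: list[dict]) -> str:
        """Return a usable model id, appending ':latest' when only the tag is missing."""
        available = [m["model"] for m in ollama_models]
        if requested in available:
            return requested
        # Strip a trailing ':latest' from each available id and compare again,
        # so a bare 'llama3.2' matches an installed 'llama3.2:latest'.
        if requested in (m["model"].split(":latest")[0] for m in ollama_models):
            logger.warning(
                f"Imprecise provider resource id was used but 'latest' is available "
                f"in Ollama - using '{requested}:latest'"
            )
            return f"{requested}:latest"
        raise ValueError(
            f"Model '{requested}' is not available in Ollama. "
            f"Available models: {', '.join(available)}"
        )

    # Hypothetical listing shaped like the Ollama client's list() response
    models = [{"model": "llama3.2:latest"}, {"model": "mistral:7b"}]
    assert resolve_model("llama3.2", models) == "llama3.2:latest"

Note that only the ':latest' tag is treated this way: a request for 'mistral' would still fail above even though 'mistral:7b' is installed, matching the conservative behavior of the commit.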