chore: standardize unsupported model error #2517

- llama_stack/exceptions.py: Add UnsupportedModelError class
- remote inference ollama.py and utils/inference/model_registry.py:
  Replace ValueError with UnsupportedModelError (see the sketch below)
- utils/inference/litellm_openai_mixin.py: Remove its register_model func;
  it now uses the parent class ModelRegistry's func

Closes #2517
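
For context, the call-site change in the registry is conceptually the following. This is a minimal sketch only: the class and method names here (SketchModelRegistry, resolve) are illustrative stand-ins, not the actual llama_stack API, and the real lookup logic in model_registry.py may differ.

from llama_stack.exceptions import UnsupportedModelError


class SketchModelRegistry:
    """Illustrative stand-in for the provider model registry."""

    def __init__(self, supported_models: dict[str, str]):
        # maps user-facing model names to provider-specific identifiers
        self.supported_models = supported_models

    def resolve(self, model_name: str) -> str:
        if model_name not in self.supported_models:
            # before this commit: raise ValueError(f"{model_name} is not supported")
            raise UnsupportedModelError(model_name, list(self.supported_models))
        return self.supported_models[model_name]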
Rohan Awhad 2025-06-25 11:10:58 -04:00
parent cfee63bd0d
commit 7ccf83fb74
4 changed files with 17 additions and 13 deletions

llama_stack/exceptions.py (new file, 13 additions)

@@ -0,0 +1,13 @@
# Copyright (c) Meta Platforms, Inc. and affiliates.
# All rights reserved.
#
# This source code is licensed under the terms described in the LICENSE file in
# the root directory of this source tree.
class UnsupportedModelError(ValueError):
    """raised when model is not present in the list of supported models"""

    def __init__(self, model_name: str, supported_models_list: list[str]):
        message = f"'{model_name}' model is not supported. Supported models are: {', '.join(supported_models_list)}"
        super().__init__(message)
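
A quick illustration of the message this class produces; the model names below are made up for the example.

from llama_stack.exceptions import UnsupportedModelError

try:
    raise UnsupportedModelError("llama-99b", ["llama3.1-8b", "llama3.1-70b"])
except UnsupportedModelError as err:
    print(err)
    # 'llama-99b' model is not supported. Supported models are: llama3.1-8b, llama3.1-70b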