litellm-mirror/litellm/llms/base_llm/base_utils.py
Krish Dholakia f770dd0c95
Support checking provider-specific /models endpoints for available models based on key (#7538)
* test(test_utils.py): initial test for valid models

Addresses https://github.com/BerriAI/litellm/issues/7525

* fix: test

* feat(fireworks_ai/transformation.py): support retrieving valid models from fireworks ai endpoint

* refactor(fireworks_ai/): support checking model info on `/v1/models` route

* docs(set_keys.md): update docs to clarify check llm provider api usage

* fix(watsonx/common_utils.py): support 'WATSONX_ZENAPIKEY' for iam auth

* fix(watsonx): read in watsonx token from env var

* fix: fix linting errors

* fix(utils.py): fix provider config check

* style: cleanup unused imports
2025-01-03 19:29:59 -08:00


from abc import ABC, abstractmethod
from typing import List, Optional

from litellm.types.utils import ModelInfoBase


class BaseLLMModelInfo(ABC):
    @abstractmethod
    def get_model_info(
        self,
        model: str,
        existing_model_info: Optional[ModelInfoBase] = None,
    ) -> Optional[ModelInfoBase]:
        pass

    @abstractmethod
    def get_models(self) -> List[str]:
        pass
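
The abstract base above is what a provider config must satisfy to support the `/models` lookups described in the commit message. Below is a hedged, self-contained sketch of a concrete subclass: `ModelInfoBase` is replaced by a plain stand-in so the example runs without litellm installed, and `ExampleProviderModelInfo` with its hard-coded model list is hypothetical — a real implementation (like the fireworks_ai transformation referenced above) would query the provider's `/v1/models` endpoint instead.

```python
from abc import ABC, abstractmethod
from typing import List, Optional


class ModelInfoBase(dict):
    """Stand-in for litellm.types.utils.ModelInfoBase (assumption:
    the real class carries per-model metadata; a dict suffices here)."""


class BaseLLMModelInfo(ABC):
    @abstractmethod
    def get_model_info(
        self,
        model: str,
        existing_model_info: Optional[ModelInfoBase] = None,
    ) -> Optional[ModelInfoBase]:
        pass

    @abstractmethod
    def get_models(self) -> List[str]:
        pass


class ExampleProviderModelInfo(BaseLLMModelInfo):
    """Hypothetical provider config. The model list is hard-coded;
    a real subclass would fetch it from the provider's /v1/models route."""

    _MODELS = ["example/model-a", "example/model-b"]

    def get_model_info(
        self,
        model: str,
        existing_model_info: Optional[ModelInfoBase] = None,
    ) -> Optional[ModelInfoBase]:
        # Unknown models yield None so callers can fall back elsewhere.
        if model not in self._MODELS:
            return None
        return existing_model_info or ModelInfoBase(key=model)

    def get_models(self) -> List[str]:
        return list(self._MODELS)


info = ExampleProviderModelInfo()
print(info.get_models())  # → ['example/model-a', 'example/model-b']
```

Returning `None` for unrecognized models (rather than raising) matches the `Optional[ModelInfoBase]` return annotation on the abstract method, letting callers probe several providers for a key without exception handling.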