LiteLLM Common Base LLM Config (pt.3): Move all OAI compatible providers to base llm config (#7148)

* refactor(fireworks_ai/): inherit from openai like base config

refactors fireworks ai to use a common config

* test: fix import in test

* refactor(watsonx/): refactor watsonx to use llm base config

refactors chat + completion routes to base config path

* fix: fix linting error

* refactor: inherit base llm config for oai compatible routes

* test: fix test

* test: fix test

This commit is contained in:
Krish Dholakia 2024-12-10 17:12:42 -08:00 (committed by GitHub)
parent 4eeaaeeacd
commit df12f87a64
7 changed files with 107 additions and 41 deletions

@@ -45,21 +45,7 @@ class DatabricksConfig(OpenAIGPTConfig):
     @classmethod
     def get_config(cls):
-        return {
-            k: v
-            for k, v in cls.__dict__.items()
-            if not k.startswith("__")
-            and not isinstance(
-                v,
-                (
-                    types.FunctionType,
-                    types.BuiltinFunctionType,
-                    classmethod,
-                    staticmethod,
-                ),
-            )
-            and v is not None
-        }
+        return super().get_config()

     def get_required_params(self) -> List[ProviderField]:
         """For a given provider, return it's required fields with a description"""