litellm-mirror/litellm/llms/empower/chat
Krish Dholakia 1e87782215
LiteLLM Common Base LLM Config (pt.3): Move all OAI compatible providers to base llm config (#7148)
* refactor(fireworks_ai/): inherit from openai like base config

refactors fireworks ai to use a common config

* test: fix import in test

* refactor(watsonx/): refactor watsonx to use llm base config

refactors chat + completion routes to base config path

* fix: fix linting error

* refactor: inherit base llm config for oai compatible routes

* test: fix test

* test: fix test
2024-12-10 17:12:42 -08:00
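The commit bullets above describe collapsing per-provider code into a shared, OpenAI-like base config that OAI-compatible providers (Fireworks AI, WatsonX chat/completion routes, etc.) inherit from, overriding only what differs. Below is a minimal sketch of that inheritance pattern; the class names, method names, and endpoint value are illustrative assumptions for this page, not LiteLLM's actual API.

```python
# Minimal sketch of a "common base config" pattern for OpenAI-compatible
# providers. Names and the endpoint value are illustrative, not LiteLLM's API.
from typing import Optional


class OpenAILikeBaseConfig:
    """Shared config for providers that speak the OpenAI chat format."""

    api_base: Optional[str] = None

    def get_supported_openai_params(self) -> list:
        # Params most OpenAI-compatible providers accept unchanged.
        return ["temperature", "max_tokens", "stream", "stop"]

    def map_openai_params(self, params: dict) -> dict:
        # Default behavior: drop anything the provider does not support.
        supported = set(self.get_supported_openai_params())
        return {k: v for k, v in params.items() if k in supported}


class FireworksAIStyleConfig(OpenAILikeBaseConfig):
    """A provider config overrides only what differs from the base."""

    api_base = "https://api.fireworks.ai/inference/v1"  # illustrative value

    def get_supported_openai_params(self) -> list:
        # Extend the shared list with provider-specific params.
        return super().get_supported_openai_params() + ["top_k"]
```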
transformation.py · LiteLLM Common Base LLM Config (pt.3): Move all OAI compatible providers to base llm config (#7148) · 2024-12-10 17:12:42 -08:00
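transformation.py is the only file listed in this empower/chat directory and is what the commit touched here. In this kind of layout, such a file typically holds the provider's chat config; a hypothetical version for an OpenAI-compatible provider, building on the base-class sketch above (the class name and endpoint are assumptions, not the repository's real contents), could be as small as:

```python
# Hypothetical provider transformation.py, reusing OpenAILikeBaseConfig from
# the sketch above. Class name and endpoint are assumptions, not real code.


class EmpowerLikeChatConfig(OpenAILikeBaseConfig):
    """An OpenAI-compatible provider can often rely on the base defaults."""

    api_base = "https://example-provider.invalid/v1"  # placeholder endpoint

    def get_supported_openai_params(self) -> list:
        # Inherit the shared list; trim or extend here only if the provider differs.
        return super().get_supported_openai_params()
```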