Improved O3 + Azure O3 support (#8181)

* fix: support azure o3 model family for fake streaming workaround (#8162)

* fix: support azure o3 model family for fake streaming workaround
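The "fake streaming" workaround mentioned above refers to serving a non-streaming completion back to the caller as a stream of chunks, for models that reject `stream=True`. A minimal illustrative sketch (the function name and chunking strategy here are assumptions, not LiteLLM's actual internals):

```python
from typing import Iterator


def fake_stream(full_text: str, chunk_size: int = 20) -> Iterator[str]:
    """Yield an already-completed response in small chunks to mimic streaming.

    Hypothetical stand-in: the real workaround wraps a full model response,
    not a plain string.
    """
    for i in range(0, len(full_text), chunk_size):
        yield full_text[i : i + chunk_size]


# Reassembling the chunks recovers the original completion unchanged.
chunks = list(fake_stream("hello world", chunk_size=4))
```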

* refactor: rename helper to is_o_series_model for clarity
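The renamed helper broadens the old o1-only check to the whole o-series family. A simplified standalone version might look like this (the matching rules below are assumptions; the real logic lives in `OpenAIOSeriesConfig`):

```python
def is_o_series_model(model: str) -> bool:
    """Return True for OpenAI o-series reasoning models (o1, o3, ...).

    Illustrative sketch only: strips provider prefixes such as "azure/"
    before checking the model family.
    """
    base = model.split("/")[-1]
    return base.startswith(("o1", "o3")) or "o_series/" in model


assert is_o_series_model("o1-preview")
assert is_o_series_model("azure/o3-mini")
assert not is_o_series_model("gpt-4o")
```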

* update function calling parameters for o3 models (#8178)

* refactor(o1_transformation.py): refactor o1 config to be o series config, expand o series model check to o3

ensures max_tokens is correctly translated for o3
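The translation the commit describes follows from o-series models expecting `max_completion_tokens` rather than `max_tokens`. A minimal sketch of that mapping (a hypothetical stand-in for the o-series config's `map_openai_params`):

```python
def translate_max_tokens(params: dict) -> dict:
    """Rename max_tokens to max_completion_tokens for o-series models.

    Simplified illustration; the real config maps many more parameters.
    """
    out = dict(params)
    if "max_tokens" in out:
        out["max_completion_tokens"] = out.pop("max_tokens")
    return out


result = translate_max_tokens({"max_tokens": 256, "temperature": 1})
```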

* feat(openai/): refactor o1 files to be 'o_series' files

expands naming to cover o3

* fix(azure/chat/o1_handler.py): azure openai is an instance of openai - was causing resets

* test(test_azure_o_series.py): assert stream faked for azure o3 mini

Resolves https://github.com/BerriAI/litellm/pull/8162

* fix(o1_transformation.py): fix o1 transformation logic to handle explicit o1_series routing

* docs(azure.md): update doc with `o_series/` model name

---------

Co-authored-by: byrongrogan <47910641+byrongrogan@users.noreply.github.com>
Co-authored-by: Low Jian Sheng <15527690+lowjiansheng@users.noreply.github.com>
Committed by Krish Dholakia on 2025-02-01 09:52:28 -08:00 via GitHub.
parent 91ed05df29
commit 23f458d2da
14 changed files with 211 additions and 37 deletions


@@ -47,8 +47,11 @@ from litellm.utils import (
 from ...types.llms.openai import *
 from ..base import BaseLLM
+from .chat.o_series_transformation import OpenAIOSeriesConfig
 from .common_utils import OpenAIError, drop_params_from_unprocessable_entity_error
+
+openaiOSeriesConfig = OpenAIOSeriesConfig()
+
 class MistralEmbeddingConfig:
     """
@@ -174,8 +177,8 @@ class OpenAIConfig(BaseConfig):
         Returns:
             list: List of supported openai parameters
         """
-        if litellm.openAIO1Config.is_model_o1_reasoning_model(model=model):
-            return litellm.openAIO1Config.get_supported_openai_params(model=model)
+        if openaiOSeriesConfig.is_model_o_series_model(model=model):
+            return openaiOSeriesConfig.get_supported_openai_params(model=model)
         elif litellm.openAIGPTAudioConfig.is_model_gpt_audio_model(model=model):
             return litellm.openAIGPTAudioConfig.get_supported_openai_params(model=model)
         else:
@@ -203,8 +206,8 @@ class OpenAIConfig(BaseConfig):
         drop_params: bool,
     ) -> dict:
         """ """
-        if litellm.openAIO1Config.is_model_o1_reasoning_model(model=model):
-            return litellm.openAIO1Config.map_openai_params(
+        if openaiOSeriesConfig.is_model_o_series_model(model=model):
+            return openaiOSeriesConfig.map_openai_params(
                 non_default_params=non_default_params,
                 optional_params=optional_params,
                 model=model,