add azure o1 pricing (#7715)

* build(model_prices_and_context_window.json): add azure o1 pricing

Closes https://github.com/BerriAI/litellm/issues/7712

* refactor: replace regex with string method for whitespace check in stop-sequences handling (#7713)

* Allows overriding keep_alive time in ollama (#7079)

* Allows overriding keep_alive time in ollama

* Also adds to ollama_chat

* Adds some info to the docs about this parameter (a usage sketch follows the commit metadata below)

* fix: together ai warning (#7688)

Co-authored-by: Carl Senze <carl.senze@aleph-alpha.com>

* fix(proxy_server.py): handle config containing thread locked objects when using get_config_state

* fix(proxy_server.py): add exception to debug

* build(model_prices_and_context_window.json): update 'supports_vision' for azure o1

---------

Co-authored-by: Wolfram Ravenwolf <52386626+WolframRavenwolf@users.noreply.github.com>
Co-authored-by: Regis David Souza Mesquita <github@rdsm.dev>
Co-authored-by: Carl <45709281+capsenz@users.noreply.github.com>
Co-authored-by: Carl Senze <carl.senze@aleph-alpha.com>
Krish Dholakia 2025-01-12 18:15:35 -08:00 committed by GitHub
parent f778865836
commit 01e2e26bd1
8 changed files with 67 additions and 5 deletions
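
The headline change adds Azure o1 entries to model_prices_and_context_window.json. As a rough illustration of the shape of such an entry (shown as a Python dict, since litellm loads the file into one), the sketch below uses placeholder numbers rather than the prices added by this commit, and "azure/o1" is an assumed key name:

# Illustrative only: field names follow existing entries in
# model_prices_and_context_window.json; every number below is a placeholder.
azure_o1_entry = {
    "azure/o1": {
        "max_tokens": 100000,            # placeholder output-token cap
        "max_input_tokens": 200000,      # placeholder context window
        "max_output_tokens": 100000,     # placeholder
        "input_cost_per_token": 1e-05,   # placeholder USD per input token
        "output_cost_per_token": 4e-05,  # placeholder USD per output token
        "litellm_provider": "azure",
        "mode": "chat",
        "supports_vision": True,         # updated by the follow-up commit in this squash
    }
}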
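
The bundled #7079 change lets callers override Ollama's keep_alive time (how long the model stays loaded after a request) for both the ollama and ollama_chat providers. A hypothetical usage sketch, assuming keep_alive is passed as an extra argument to litellm.completion and forwarded to the Ollama API:

import litellm

# Assumed pass-through of Ollama's keep_alive parameter: keep the model
# loaded for 10 minutes after this call (-1 would keep it loaded indefinitely).
response = litellm.completion(
    model="ollama_chat/llama3",
    messages=[{"role": "user", "content": "Hello"}],
    keep_alive="10m",
)
print(response.choices[0].message.content)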


@@ -1107,6 +1107,29 @@ def test_proxy_config_state_post_init_callback_call():
    assert config["litellm_settings"]["default_team_settings"][0]["team_id"] == "test"


def test_proxy_config_state_get_config_state_error():
    """
    Ensures that get_config_state does not raise an error when the config is not a valid dictionary
    """
    from litellm.proxy.proxy_server import ProxyConfig
    import threading

    test_config = {
        "callback_list": [
            {
                "lock": threading.RLock(),  # This will cause the deep copy to fail
                "name": "test_callback",
            }
        ],
        "model_list": ["gpt-4", "claude-3"],
    }

    pc = ProxyConfig()
    pc.config = test_config
    config = pc.get_config_state()
    assert config == {}


@pytest.mark.parametrize(
    "associated_budget_table, expected_user_api_key_auth_key, expected_user_api_key_auth_value",
    [
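
The new test above pins down the behavior behind the proxy_server.py fix: get_config_state should fall back to an empty dict instead of raising when the in-memory config holds objects that cannot be deep-copied, such as the threading.RLock nested in callback_list. A minimal sketch of that handling, assuming get_config_state returns a deep copy of self.config so callers cannot mutate live proxy state (not the actual litellm implementation):

import copy


class ProxyConfigSketch:
    """Illustrative stand-in for litellm's ProxyConfig, not the real class."""

    def __init__(self) -> None:
        self.config: dict = {}

    def get_config_state(self) -> dict:
        # Return a deep copy so callers cannot mutate the live config.
        # Objects like threading.RLock cannot be deep-copied, so degrade
        # to an empty dict instead of raising, matching `config == {}`
        # in the test above.
        try:
            return copy.deepcopy(self.config)
        except Exception:
            return {}

With the test's config, copy.deepcopy raises a TypeError on the RLock, the except branch runs, and get_config_state() returns {}.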