OpenAI /v1/realtime API support (#6047)

* feat(azure/realtime): initial working commit for proxy azure openai realtime endpoint support

Adds support for passing /v1/realtime calls via litellm proxy

* feat(realtime_api/main.py): abstraction for handling openai realtime api calls

* feat(router.py): add `arealtime()` endpoint in router for realtime api calls

Allows using `model_list` in proxy for realtime as well
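For illustration, a `model_list` entry in the proxy config for a realtime model might look like the following (the alias and model names here are illustrative, not taken from the commit):

```yaml
model_list:
  - model_name: gpt-4o-realtime             # alias clients request; illustrative
    litellm_params:
      model: openai/gpt-4o-realtime-preview # provider-prefixed model name
      api_key: os.environ/OPENAI_API_KEY    # resolved from the environment at runtime
```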

* fix: make realtime api a private function

The structure might change based on feedback; keeping the function private makes that clear to users.

* build(requirements.txt): add websockets to the requirements.txt

* feat(openai/realtime): add openai /v1/realtime api support
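A minimal client sketch of what calling the proxied endpoint could look like, assuming (these are my assumptions, not details from the commit) a proxy listening on `ws://localhost:4000` and a model aliased `gpt-4o-realtime` in its `model_list`:

```python
# Minimal realtime client sketch. The proxy address and model alias below
# are hypothetical values for illustration.
import asyncio
import json

try:
    import websockets  # this commit adds `websockets` to requirements.txt
except ImportError:
    websockets = None


def realtime_url(base: str, model: str) -> str:
    # OpenAI's realtime endpoint selects the model via a query parameter.
    return f"{base}/v1/realtime?model={model}"


async def listen() -> None:
    if websockets is None:
        raise RuntimeError("pip install websockets")
    async with websockets.connect(
        realtime_url("ws://localhost:4000", "gpt-4o-realtime")
    ) as ws:
        # Ask for text-only responses, then print event types as they arrive.
        await ws.send(json.dumps(
            {"type": "session.update", "session": {"modalities": ["text"]}}
        ))
        async for message in ws:
            print(json.loads(message)["type"])
```

Run it with `asyncio.run(listen())` against a running proxy; if the proxy requires auth, a bearer token would be passed as a connect-time header (the exact keyword argument for headers varies across `websockets` versions).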
Krish Dholakia 2024-10-03 17:11:22 -04:00 committed by GitHub
parent 04b7490e0c
commit 66dc88c48d
11 changed files with 350 additions and 7 deletions


@@ -311,6 +311,8 @@ def get_llm_provider(
                     dynamic_api_key
                 )
             )
+        if dynamic_api_key is None and api_key is not None:
+            dynamic_api_key = api_key
         return model, custom_llm_provider, dynamic_api_key, api_base
     elif model.split("/", 1)[0] in litellm.provider_list:
         custom_llm_provider = model.split("/", 1)[0]
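The two added lines are a simple fallback: if provider detection produced no `dynamic_api_key` but the caller passed an explicit `api_key`, the caller's key is used. As a standalone sketch (the function name is mine, not litellm's):

```python
from typing import Optional


def resolve_api_key(dynamic_api_key: Optional[str],
                    api_key: Optional[str]) -> Optional[str]:
    # Mirrors the added diff lines: fall back to the explicitly passed
    # api_key only when no provider-specific dynamic key was found.
    if dynamic_api_key is None and api_key is not None:
        dynamic_api_key = api_key
    return dynamic_api_key
```

A provider-derived key still wins when both are present; the explicit key is only a fallback.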