LiteLLM Minor Fixes & Improvements (09/25/2024) (#5893)

* fix(langfuse.py): support new langfuse prompt_chat class init params

* fix(langfuse.py): handle new init values on prompt chat + prompt text templates

Fixes an error raised during Langfuse logging.

* docs(openai_compatible.md): clarify that the `openai/` prefix handles correct routing for the `/v1/completions` route

Fixes https://github.com/BerriAI/litellm/issues/5876

* fix(utils.py): handle unmapped gemini model optional param translation

Fixes https://github.com/BerriAI/litellm/issues/5888

* fix(o1_transformation.py): fix o1 validation so it does not raise an error when temperature=1

Fixes https://github.com/BerriAI/litellm/issues/5884

* fix(prisma_client.py): refresh iam token

Fixes https://github.com/BerriAI/litellm/issues/5896

* fix: pass drop params where required

* fix(utils.py): pass drop_params correctly

* fix(types/vertex_ai.py): fix generation config

* test(test_max_completion_tokens.py): fix test

* fix(vertex_and_google_ai_studio_gemini.py): fix map openai params
Commit 0a03f2f11e by Krish Dholakia, 2024-09-26 16:41:44 -07:00 (committed by GitHub)
Parent: ed5635e9a2
22 changed files with 755 additions and 292 deletions


@@ -29,7 +29,7 @@ def _is_base64(s):
     return False
-def str_to_bool(value: str) -> Optional[bool]:
+def str_to_bool(value: Optional[str]) -> Optional[bool]:
     """
     Converts a string to a boolean if it's a recognized boolean string.
     Returns None if the string is not a recognized boolean value.
@@ -37,6 +37,9 @@ def str_to_bool(value: Optional[str]) -> Optional[bool]:
     :param value: The string to be checked.
     :return: True or False if the string is a recognized boolean, otherwise None.
     """
+    if value is None:
+        return None
     true_values = {"true"}
     false_values = {"false"}
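Pieced together from the two hunks above, the patched function looks roughly like the sketch below. Note the diff cuts off after the set definitions, so the trailing normalization and set lookup are assumptions, not part of the shown patch:

```python
from typing import Optional


def str_to_bool(value: Optional[str]) -> Optional[bool]:
    """
    Converts a string to a boolean if it's a recognized boolean string.
    Returns None if the string is not a recognized boolean value.

    :param value: The string to be checked.
    :return: True or False if the string is a recognized boolean, otherwise None.
    """
    # New guard from this commit: accept Optional[str] and short-circuit on None.
    if value is None:
        return None
    true_values = {"true"}
    false_values = {"false"}
    # Assumed tail: case-insensitive membership check against the two sets.
    value_lower = value.strip().lower()
    if value_lower in true_values:
        return True
    if value_lower in false_values:
        return False
    return None
```

With the guard in place, callers can pass a possibly-unset value (e.g. an environment variable looked up with `os.getenv`) without a `TypeError` from calling `.lower()` on `None`.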