litellm/litellm
Latest commit a1d9e96b31 by Krish Dholakia
LiteLLM Minor Fixes & Improvements (09/25/2024) (#5893)
* fix(langfuse.py): support new langfuse prompt_chat class init params

* fix(langfuse.py): handle new init values on prompt chat + prompt text templates

Fixes an error raised during Langfuse logging
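For context, a rough sketch of how Langfuse logging is typically wired up in LiteLLM, which is the code path these prompt-template fixes touch (the model name and credential values are placeholders):

```python
import os
import litellm

# Langfuse credentials are read from the environment by the langfuse callback.
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-..."   # placeholder
os.environ["LANGFUSE_SECRET_KEY"] = "sk-..."   # placeholder

# Send success logs to Langfuse; this is where the prompt chat / prompt text
# template handling described above comes into play.
litellm.success_callback = ["langfuse"]

response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
```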

* docs(openai_compatible.md): clarify that the `openai/` prefix handles correct routing for the `/v1/completions` route

Fixes https://github.com/BerriAI/litellm/issues/5876
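As a rough sketch of the pattern this doc change describes, assuming a self-hosted OpenAI-compatible server at a placeholder `api_base` (model name and key are also placeholders):

```python
import litellm

# The `openai/` prefix tells LiteLLM to treat the endpoint as OpenAI-compatible
# and route the request to the right path on the given api_base.
response = litellm.completion(
    model="openai/my-model",               # placeholder model name
    api_base="http://localhost:8000/v1",   # placeholder OpenAI-compatible server
    api_key="sk-placeholder",
    messages=[{"role": "user", "content": "Hello"}],
)

# For the legacy /v1/completions route, text_completion() takes a prompt
# instead of a messages list.
text_response = litellm.text_completion(
    model="openai/my-model",
    api_base="http://localhost:8000/v1",
    api_key="sk-placeholder",
    prompt="Hello",
)
```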

* fix(utils.py): handle unmapped gemini model optional param translation

Fixes https://github.com/BerriAI/litellm/issues/5888
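A hedged sketch of the situation this fix covers: an OpenAI-style parameter sent to a Gemini model that has no explicit parameter mapping, with `drop_params` so the unsupported parameter is dropped instead of raising (the model name is a placeholder):

```python
import litellm

response = litellm.completion(
    model="gemini/some-new-gemini-model",  # placeholder: a model without an explicit param map
    messages=[{"role": "user", "content": "Hi"}],
    frequency_penalty=0.5,                 # dropped if the model/provider does not support it
    drop_params=True,                      # drop unsupported OpenAI params instead of erroring
)
```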

* fix(o1_transformation.py): fix o1 validation so it does not raise an error when temperature=1

Fixes https://github.com/BerriAI/litellm/issues/5884
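A short sketch of the call this validation change affects; o1 models only accept the default temperature of 1, so passing it explicitly should no longer raise:

```python
import litellm

# temperature=1 is the only value the o1 models accept; after this fix it is
# accepted by LiteLLM's validation instead of raising an error.
response = litellm.completion(
    model="o1-preview",
    messages=[{"role": "user", "content": "What is 2 + 2?"}],
    temperature=1,
)
```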

* fix(prisma_client.py): refresh the IAM token

Fixes https://github.com/BerriAI/litellm/issues/5896

* fix: pass drop_params where required

* fix(utils.py): pass drop_params correctly
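For reference, `drop_params` can also be set module-wide instead of per request; a minimal sketch:

```python
import litellm

# Drop any unsupported OpenAI params for every request made through litellm,
# rather than passing drop_params=True on each call.
litellm.drop_params = True
```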

* fix(types/vertex_ai.py): fix generation config

* test(test_max_completion_tokens.py): fix test

* fix(vertex_and_google_ai_studio_gemini.py): fix map openai params
2024-09-26 16:41:44 -07:00
| Name | Last commit message | Last commit date |
|------|---------------------|------------------|
| adapters | fix(anthropic_adapter.py): fix sync streaming | 2024-08-03 20:52:29 -07:00 |
| assistants | [Feat-Proxy] Add Azure Assistants API - Create Assistant, Delete Assistant Support (#5777) | 2024-09-18 16:27:33 -07:00 |
| batches | use correct type hints for audio transcriptions | 2024-09-05 09:12:27 -07:00 |
| deprecated_litellm_server | | |
| files | fix import error | 2024-09-05 10:09:44 -07:00 |
| fine_tuning | refactor get_secret | 2024-09-03 10:42:12 -07:00 |
| integrations | LiteLLM Minor Fixes & Improvements (09/25/2024) (#5893) | 2024-09-26 16:41:44 -07:00 |
| litellm_core_utils | [Feat] Improve OTEL Tracking - Require all Redis Cache reads to be logged on OTEL (#5881) | 2024-09-25 10:57:08 -07:00 |
| llms | LiteLLM Minor Fixes & Improvements (09/25/2024) (#5893) | 2024-09-26 16:41:44 -07:00 |
| proxy | LiteLLM Minor Fixes & Improvements (09/25/2024) (#5893) | 2024-09-26 16:41:44 -07:00 |
| rerank_api | LiteLLM Minor Fixes & Improvements (09/24/2024) (#5880) | 2024-09-25 22:11:57 -07:00 |
| router_strategy | LiteLLM Minor Fixes & Improvements (09/21/2024) (#5819) | 2024-09-21 18:51:53 -07:00 |
| router_utils | [Feat-Prometheus] Track exception status on litellm_deployment_failure_responses (#5706) | 2024-09-14 18:44:31 -07:00 |
| secret_managers | LiteLLM Minor Fixes & Improvements (09/25/2024) (#5893) | 2024-09-26 16:41:44 -07:00 |
| tests | LiteLLM Minor Fixes & Improvements (09/25/2024) (#5893) | 2024-09-26 16:41:44 -07:00 |
| types | LiteLLM Minor Fixes & Improvements (09/25/2024) (#5893) | 2024-09-26 16:41:44 -07:00 |
| __init__.py | LiteLLM Minor Fixes & Improvements (09/24/2024) (#5880) | 2024-09-25 22:11:57 -07:00 |
| _logging.py | fix(_logging.py): fix timestamp format for json logs | 2024-06-20 15:20:21 -07:00 |
| _redis.py | LiteLLM Minor Fixes & Improvements (09/21/2024) (#5819) | 2024-09-21 18:51:53 -07:00 |
| _service_logger.py | [Fix] OTEL - Don't log messages when callback settings disable message logging (#5875) | 2024-09-24 18:29:52 -07:00 |
| _version.py | | |
| budget_manager.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| caching.py | [Feat] Improve OTEL Tracking - Require all Redis Cache reads to be logged on OTEL (#5881) | 2024-09-25 10:57:08 -07:00 |
| cost.json | | |
| cost_calculator.py | LiteLLM Minor Fixes & Improvements (09/24/2024) (#5880) | 2024-09-25 22:11:57 -07:00 |
| exceptions.py | LiteLLM Minor Fixes & Improvements (09/21/2024) (#5819) | 2024-09-21 18:51:53 -07:00 |
| main.py | LiteLLM Minor Fixes & Improvements (09/24/2024) (#5880) | 2024-09-25 22:11:57 -07:00 |
| model_prices_and_context_window_backup.json | LiteLLM Minor Fixes & Improvements (09/24/2024) (#5880) | 2024-09-25 22:11:57 -07:00 |
| py.typed | feature - Types for mypy - #360 | 2024-05-30 14:14:41 -04:00 |
| requirements.txt | | |
| router.py | [Feat-Router] Allow setting which environment to use a model on (#5892) | 2024-09-25 10:12:06 -07:00 |
| scheduler.py | feat(scheduler.py): support redis caching for req. prioritization | 2024-06-06 14:19:21 -07:00 |
| timeout.py | | |
| utils.py | LiteLLM Minor Fixes & Improvements (09/25/2024) (#5893) | 2024-09-26 16:41:44 -07:00 |