| File | Last commit message | Last commit date |
| --- | --- | --- |
| audio_utils | fix import error | 2024-09-05 10:09:44 -07:00 |
| llm_cost_calc | LiteLLM Minor Fixes & Improvements (10/09/2024) (#6139) | 2024-10-10 00:42:11 -07:00 |
| llm_response_utils | (fix) litellm.text_completion raises a non-blocking error on simple usage (#6546) | 2024-11-04 15:47:48 -08:00 |
| asyncify.py | build(config.yml): bump anyio version | 2024-08-27 07:37:06 -07:00 |
| core_helpers.py | Litellm dev 11 07 2024 (#6649) | 2024-11-08 19:34:22 +05:30 |
| default_encoding.py | Litellm dev 11 07 2024 (#6649) | 2024-11-08 19:34:22 +05:30 |
| exception_mapping_utils.py | [Feature]: Stop swallowing up AzureOpenAi exception responses in litellm's implementation for a BadRequestError (#6745) | 2024-11-14 15:54:28 -08:00 |
| get_llm_provider_logic.py | chore: comment for maritalk (#6607) | 2024-11-07 12:20:12 -08:00 |
| get_supported_openai_params.py | fix(utils.py): add logprobs support for together ai | 2024-11-13 12:26:06 +05:30 |
| json_validation_rule.py | feat(vertex_ai_anthropic.py): support response_schema for vertex ai anthropic calls | 2024-07-18 16:57:38 -07:00 |
| litellm_logging.py | Merge branch 'main' into litellm_dev_11_13_2024 | 2024-11-15 11:18:02 +05:30 |
| llm_request_utils.py | Litellm ruff linting enforcement (#5992) | 2024-10-01 19:44:20 -04:00 |
| logging_utils.py | (refactor) use helper function _assemble_complete_response_from_streaming_chunks to assemble complete responses in caching and logging callbacks (#6220) | 2024-10-15 12:45:12 +05:30 |
| mock_functions.py | test(router_code_coverage.py): check if all router functions are dire… (#6186) | 2024-10-14 22:44:00 -07:00 |
| README.md | Litellm dev 11 07 2024 (#6649) | 2024-11-08 19:34:22 +05:30 |
| realtime_streaming.py | Litellm dev 10 22 2024 (#6384) | 2024-10-22 21:18:54 -07:00 |
| redact_messages.py | LiteLLM Minor Fixes & Improvements (10/04/2024) (#6064) | 2024-10-04 21:28:53 -04:00 |
| response_header_helpers.py | fix(utils.py): guarantee openai-compatible headers always exist in response | 2024-09-28 21:08:15 -07:00 |
| rules.py | Litellm dev 11 07 2024 (#6649) | 2024-11-08 19:34:22 +05:30 |
| streaming_chunk_builder_utils.py | LiteLLM Minor Fixes & Improvements (11/05/2024) (#6590) | 2024-11-07 04:17:05 +05:30 |
| streaming_handler.py | Litellm dev 11 07 2024 (#6649) | 2024-11-08 19:34:22 +05:30 |
| token_counter.py | fix(token_counter.py): New `get_modified_max_tokens` helper func | 2024-06-27 15:38:09 -07:00 |