litellm-mirror/litellm/litellm_core_utils
Latest commit: Ishaan Jaff, 5652c375b3, 2024-11-01 20:37:09 +05:30
(feat) add XAI ChatCompletion Support (#6373)
* init commit for XAI
* add full logic for xai chat completion
* test_completion_xai
* docs xAI
* add xai/grok-beta
* test_xai_chat_config_get_openai_compatible_provider_info
* test_xai_chat_config_map_openai_params
* add xai streaming test
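The commit above wires xAI's OpenAI-compatible chat API into LiteLLM and registers the xai/grok-beta model, including streaming. A minimal usage sketch follows, built on LiteLLM's standard `litellm.completion` interface; the `XAI_API_KEY` environment variable and the placeholder key are assumptions here, not confirmed by this listing.

```python
# Hedged sketch: calling the newly added xAI provider through LiteLLM's
# OpenAI-compatible completion interface. Assumes the API key is read
# from the XAI_API_KEY environment variable (assumption).
import os
import litellm

os.environ["XAI_API_KEY"] = "sk-..."  # placeholder key

# Non-streaming chat completion against the model added in this commit.
response = litellm.completion(
    model="xai/grok-beta",
    messages=[{"role": "user", "content": "Say hello from grok-beta."}],
)
print(response.choices[0].message.content)

# The commit also adds a streaming test; streaming follows the usual
# OpenAI-style chunk iteration.
stream = litellm.completion(
    model="xai/grok-beta",
    messages=[{"role": "user", "content": "Stream a short reply."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="")
```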
Name | Last commit | Last commit date
audio_utils | fix import error | 2024-09-05 10:09:44 -07:00
llm_cost_calc | LiteLLM Minor Fixes & Improvements (10/09/2024) (#6139) | 2024-10-10 00:42:11 -07:00
llm_response_utils | feat(proxy_server.py): check if views exist on proxy server startup +… (#6360) | 2024-10-24 22:02:15 -07:00
asyncify.py | build(config.yml): bump anyio version | 2024-08-27 07:37:06 -07:00
core_helpers.py | Litellm dev 10 29 2024 (#6502) | 2024-10-29 22:04:16 -07:00
exception_mapping_utils.py | (code quality) add ruff check PLR0915 for too-many-statements (#6309) | 2024-10-18 15:36:49 +05:30
get_llm_provider_logic.py | (feat) add XAI ChatCompletion Support (#6373) | 2024-11-01 20:37:09 +05:30
json_validation_rule.py | feat(vertex_ai_anthropic.py): support response_schema for vertex ai anthropic calls | 2024-07-18 16:57:38 -07:00
litellm_logging.py | Litellm dev 10 29 2024 (#6502) | 2024-10-29 22:04:16 -07:00
llm_request_utils.py | Litellm ruff linting enforcement (#5992) | 2024-10-01 19:44:20 -04:00
logging_utils.py | (refactor) use helper function _assemble_complete_response_from_streaming_chunks to assemble complete responses in caching and logging callbacks (#6220) | 2024-10-15 12:45:12 +05:30
mock_functions.py | test(router_code_coverage.py): check if all router functions are dire… (#6186) | 2024-10-14 22:44:00 -07:00
realtime_streaming.py | Litellm dev 10 22 2024 (#6384) | 2024-10-22 21:18:54 -07:00
redact_messages.py | LiteLLM Minor Fixes & Improvements (10/04/2024) (#6064) | 2024-10-04 21:28:53 -04:00
response_header_helpers.py | fix(utils.py): guarantee openai-compatible headers always exist in response | 2024-09-28 21:08:15 -07:00
streaming_chunk_builder_utils.py | Litellm openai audio streaming (#6325) | 2024-10-19 16:16:51 -07:00
streaming_utils.py | fix(streaming_utils.py): fix generic_chunk_has_all_required_fields | 2024-08-26 21:13:02 -07:00
token_counter.py | fix(token_counter.py): New `get_modified_max_tokens` helper func | 2024-06-27 15:38:09 -07:00