litellm/litellm/litellm_core_utils
Latest commit: 7ec414a3cf by Ali Waleed, 2024-10-11 19:19:53 +05:30

Feat: Add Langtrace integration (#5341)

* Feat: Add Langtrace integration
* add langtrace service name
* fix timestamps for traces
* add tests
* Discard Callback + use existing otel logger
* cleanup
* remove print statements
* remove callback
* add docs
* docs
* add logging docs
* format logging
* remove emoji and add litellm proxy example
* format logging
* format `logging.md`
* add langtrace docs to logging.md
* sync conflict
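The commit notes above ("Discard Callback + use existing otel logger") suggest the Langtrace integration rides on LiteLLM's existing OTel callback rather than a dedicated callback class. Below is a minimal sketch of how that wiring might look from client code; the Langtrace endpoint URL, header format, and environment variable values are assumptions for illustration, not confirmed by this listing.

```python
# Hedged sketch: sending LiteLLM traces to Langtrace via the existing OTel logger.
# The endpoint URL, header format, and env var values below are assumptions.
import os

import litellm

# Point the OTel exporter at a Langtrace-compatible OTLP HTTP endpoint (assumed values).
os.environ["OTEL_EXPORTER"] = "otlp_http"
os.environ["OTEL_ENDPOINT"] = "https://langtrace.example.com/api/trace"   # hypothetical endpoint
os.environ["OTEL_HEADERS"] = "api_key=<your-langtrace-api-key>"           # hypothetical auth header

# Reuse the existing OTel logger, as the commit description indicates.
litellm.callbacks = ["otel"]

# Any completion call made after this point would be traced.
response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello from the Langtrace sketch"}],
)
print(response.choices[0].message.content)
```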
| File | Last commit | Date |
|------|-------------|------|
| audio_utils/ | fix import error | 2024-09-05 10:09:44 -07:00 |
| llm_cost_calc/ | LiteLLM Minor Fixes & Improvements (10/09/2024) (#6139) | 2024-10-10 00:42:11 -07:00 |
| asyncify.py | build(config.yml): bump anyio version | 2024-08-27 07:37:06 -07:00 |
| core_helpers.py | [Feat] Improve OTEL Tracking - Require all Redis Cache reads to be logged on OTEL (#5881) | 2024-09-25 10:57:08 -07:00 |
| exception_mapping_utils.py | LiteLLM Minor Fixes & Improvements (10/04/2024) (#6064) | 2024-10-04 21:28:53 -04:00 |
| get_llm_provider_logic.py | LiteLLM Minor Fixes & Improvements (10/07/2024) (#6101) | 2024-10-07 22:17:22 -07:00 |
| json_validation_rule.py | feat(vertex_ai_anthropic.py): support response_schema for vertex ai anthropic calls | 2024-07-18 16:57:38 -07:00 |
| litellm_logging.py | Feat: Add Langtrace integration (#5341) | 2024-10-11 19:19:53 +05:30 |
| llm_request_utils.py | Litellm ruff linting enforcement (#5992) | 2024-10-01 19:44:20 -04:00 |
| logging_utils.py | feat run aporia as post call success hook | 2024-08-19 11:25:31 -07:00 |
| redact_messages.py | LiteLLM Minor Fixes & Improvements (10/04/2024) (#6064) | 2024-10-04 21:28:53 -04:00 |
| response_header_helpers.py | fix(utils.py): guarantee openai-compatible headers always exist in response | 2024-09-28 21:08:15 -07:00 |
| streaming_utils.py | fix(streaming_utils.py): fix generic_chunk_has_all_required_fields | 2024-08-26 21:13:02 -07:00 |
| token_counter.py | fix(token_counter.py): New `get_modified_max_tokens` helper func | 2024-06-27 15:38:09 -07:00 |