import Image from '@theme/IdealImage'; import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem';
# OpenTelemetry - Tracing LLMs with any observability tool
OpenTelemetry is a CNCF standard for observability. It connects to any observability tool, such as Jaeger, Zipkin, Datadog, New Relic, Traceloop and others.
<Image img={require('../../img/traceloop_dash.png')} />
## Getting Started
Install the OpenTelemetry SDK:

```shell
pip install opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp
```
Set the environment variables (different providers may require different variables):

<Tabs>

<TabItem value="traceloop" label="Log to Traceloop Cloud">

```shell
OTEL_EXPORTER="otlp_http"
OTEL_ENDPOINT="https://api.traceloop.com"
OTEL_HEADERS="Authorization=Bearer%20<your-api-key>"
```

</TabItem>

<TabItem value="otel-collector-http" label="Log to OTEL HTTP Collector">

```shell
OTEL_EXPORTER="otlp_http"
OTEL_ENDPOINT="http://0.0.0.0:4318"
```

</TabItem>

<TabItem value="otel-collector-grpc" label="Log to OTEL GRPC Collector">

```shell
OTEL_EXPORTER="otlp_grpc"
OTEL_ENDPOINT="http://0.0.0.0:4317"
```

</TabItem>

<TabItem value="laminar" label="Log to Laminar">

```shell
OTEL_EXPORTER="otlp_grpc"
OTEL_ENDPOINT="https://api.lmnr.ai:8443"
OTEL_HEADERS="authorization=Bearer <project-api-key>"
```

</TabItem>

</Tabs>
Use just one line of code to instantly log your LLM responses across all providers with OpenTelemetry:

```python
litellm.callbacks = ["otel"]
```
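For example, a minimal end-to-end sketch, assuming a local OTEL collector on port 4318 (the model name and prompt are placeholders):

```python
import os
import litellm

# Assumption: a local OTEL collector accepting OTLP over HTTP on port 4318
os.environ["OTEL_EXPORTER"] = "otlp_http"
os.environ["OTEL_ENDPOINT"] = "http://0.0.0.0:4318"

# Enable the OpenTelemetry callback for all LiteLLM calls
litellm.callbacks = ["otel"]

# Works across all providers; spans are emitted automatically
response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)
```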
## Redacting Messages, Response Content from OpenTelemetry Logging
### Redact Messages and Responses from all OpenTelemetry Logging
Set `litellm.turn_off_message_logging=True`. This will prevent the messages and responses from being logged to OpenTelemetry, but request metadata will still be logged.
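A minimal sketch of global redaction (the setting applies to every subsequent call):

```python
import litellm

# Redact message and response content from all OpenTelemetry logs;
# request metadata (model, latency, token counts) is still recorded.
litellm.turn_off_message_logging = True
litellm.callbacks = ["otel"]
```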
### Redact Messages and Responses from specific OpenTelemetry Logging
In the metadata typically passed for text completion or embedding calls, you can set specific keys to mask the messages and responses for this call.

- Setting `mask_input` to `True` will mask the input from being logged for this call.
- Setting `mask_output` to `True` will mask the output from being logged for this call.
Be aware that if you are continuing an existing trace, and you set `update_trace_keys` to include either `input` or `output` and you set the corresponding `mask_input` or `mask_output`, then that trace will have its existing input and/or output replaced with a redacted message.
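A sketch of per-call masking via `metadata` (the model name and prompt are placeholders):

```python
import litellm

litellm.callbacks = ["otel"]

response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "some sensitive prompt"}],
    metadata={
        "mask_input": True,   # redact the input for this call only
        "mask_output": True,  # redact the output for this call only
    },
)
```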
## Support
For any question or issue with the integration, you can reach out to the OpenLLMetry maintainers on Slack or via email.
## Troubleshooting
### Trace LiteLLM Proxy user/key/org/team information on failed requests
LiteLLM emits `user_api_key_metadata` for both successful and failed requests. It includes:

- key hash
- key_alias
- org_id
- user_id
- team_id

To view it, click under `litellm_request` in the trace.
<Image img={require('../../img/otel_debug_trace.png')} />