# llama-stack-mirror/llama_stack/providers/utils/telemetry

Latest commit 0b5a794c27 by ehhuang (2025-08-08 13:47:36 -07:00):
fix: telemetry logger spams when queue is full (#3070)

# What does this PR do?

Stops the telemetry logger from spamming log output when its event queue is full.
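
The problem, per the commit title, is that the logger emits output for every event once the queue is full. A minimal sketch of one common way to avoid that kind of spam is below: a bounded queue that drops events and emits a single rate-limited, aggregated warning instead of one line per dropped event. The class, field names, and interval here are illustrative assumptions, not the actual code in `tracing.py`.

```python
import logging
import queue
import threading
import time

logger = logging.getLogger(__name__)


class BoundedEventQueue:
    """Illustrative bounded queue that drops events when full without
    emitting one warning line per dropped event."""

    # Hypothetical rate limit: warn at most once per minute about drops.
    _WARN_INTERVAL_SECONDS = 60.0

    def __init__(self, maxsize: int = 1000) -> None:
        self._queue: queue.Queue = queue.Queue(maxsize=maxsize)
        self._dropped = 0
        self._last_warn = float("-inf")
        self._lock = threading.Lock()

    def put(self, event) -> None:
        try:
            self._queue.put_nowait(event)
        except queue.Full:
            with self._lock:
                self._dropped += 1
                now = time.monotonic()
                # Emit a single aggregated warning at most once per interval
                # instead of logging every dropped event.
                if now - self._last_warn >= self._WARN_INTERVAL_SECONDS:
                    logger.warning(
                        "Telemetry queue is full; dropped %d event(s) since the last warning",
                        self._dropped,
                    )
                    self._dropped = 0
                    self._last_warn = now
```
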
## Test Plan
Ran a stress test on the chat completion endpoint locally with 10 concurrent users over 3 minutes.

Before:
<img width="1440" height="201" alt="image" src="https://github.com/user-attachments/assets/24e0d580-186e-4e24-931e-2b936c5859b6" />

After:
<img width="1434" height="204" alt="image" src="https://github.com/user-attachments/assets/4b806d88-f822-41e9-b25a-018cc4bec866" />

(Will send scripts in a future PR.)
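
The stress-test scripts themselves are deferred to a future PR. As a rough, hypothetical illustration of the shape of such a test, the sketch below drives an OpenAI-compatible chat completions route with 10 concurrent users for 3 minutes; the URL, port, model name, and payload are placeholder assumptions, not values from this PR.

```python
import asyncio
import time

import httpx

# Placeholder endpoint and model; adjust to the local deployment under test.
CHAT_COMPLETIONS_URL = "http://localhost:8321/v1/openai/v1/chat/completions"
MODEL = "my-model"

CONCURRENT_USERS = 10
DURATION_SECONDS = 180


async def user_loop(client: httpx.AsyncClient, deadline: float) -> int:
    """Send chat completion requests back-to-back until the deadline."""
    sent = 0
    while time.monotonic() < deadline:
        resp = await client.post(
            CHAT_COMPLETIONS_URL,
            json={
                "model": MODEL,
                "messages": [{"role": "user", "content": "Hello"}],
            },
            timeout=60.0,
        )
        resp.raise_for_status()
        sent += 1
    return sent


async def main() -> None:
    deadline = time.monotonic() + DURATION_SECONDS
    async with httpx.AsyncClient() as client:
        counts = await asyncio.gather(
            *(user_loop(client, deadline) for _ in range(CONCURRENT_USERS))
        )
    print(f"Total requests sent: {sum(counts)}")


if __name__ == "__main__":
    asyncio.run(main())
```
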

| File | Last commit | Date |
| --- | --- | --- |
| `__init__.py` | kill unnecessarily large imports from telemetry init | 2024-12-08 16:57:16 -08:00 |
| `dataset_mixin.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `sqlite_trace_store.py` | chore(test): fix flaky telemetry tests (#2815) | 2025-07-22 12:30:14 -07:00 |
| `trace_protocol.py` | chore: update pre-commit hook versions (#2708) | 2025-07-10 16:47:59 +02:00 |
| `tracing.py` | fix: telemetry logger spams when queue is full (#3070) | 2025-08-08 13:47:36 -07:00 |