llama-stack-mirror/llama_stack/providers
Charlie Doern e2aefd797f fix: fix meta_reference telemetry console
Currently, none of the inference metrics are printed when using the console.

This is because self._log_metric only does anything when self.meter is not None.

Acquire the lock and add the event to the span, as the other `_log_...` methods do.

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-08-06 15:48:41 -04:00
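The commit message above describes a pattern: record the metric as an event on the span under the adapter's lock (mirroring the other `_log_...` methods), so console output no longer depends on an OpenTelemetry meter being configured. Below is a minimal, self-contained sketch of that idea; the `TelemetryAdapter`, `Span`, and `MetricEvent` names are hypothetical stand-ins for illustration, not the actual llama-stack classes.

```python
# Illustrative sketch only: attach the metric to its span under the lock,
# and treat meter export as optional. Names and fields are hypothetical.
import threading
from dataclasses import dataclass, field


@dataclass
class Span:
    name: str
    events: list = field(default_factory=list)


@dataclass
class MetricEvent:
    span_id: str
    metric: str
    value: float
    unit: str
    attributes: dict = field(default_factory=dict)


class TelemetryAdapter:
    def __init__(self, meter=None):
        self.meter = meter                  # optional OpenTelemetry meter
        self._lock = threading.Lock()
        self._spans: dict[str, Span] = {}

    def _log_metric(self, event: MetricEvent) -> None:
        # Always add the metric as a span event so console sinks can print it,
        # instead of returning early when no meter is configured.
        with self._lock:
            span = self._spans.setdefault(event.span_id, Span(event.span_id))
            span.events.append(
                {
                    "name": event.metric,
                    "attributes": {
                        "value": event.value,
                        "unit": event.unit,
                        **event.attributes,
                    },
                }
            )
        # Exporting through the meter stays conditional on it being set.
        if self.meter is not None:
            counter = self.meter.create_counter(event.metric, unit=event.unit)
            counter.add(event.value, attributes=event.attributes)


# Usage: even without a meter, the metric shows up as a span event.
adapter = TelemetryAdapter(meter=None)
adapter._log_metric(MetricEvent("span-1", "prompt_tokens", 12, "tokens"))
print(adapter._spans["span-1"].events)
```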
inline fix: fix meta_reference telemetry console 2025-08-06 15:48:41 -04:00
registry feat: Add openAI compatible APIs to Qdrant (#2465) 2025-08-01 00:41:34 -04:00
remote chore(misc): make tests and starter faster (#3042) 2025-08-05 14:55:05 -07:00
utils fix: actually propagate inference metrics 2025-08-06 15:48:41 -04:00
__init__.py API Updates (#73) 2024-09-17 19:51:35 -07:00
datatypes.py feat: create unregister shield API endpoint in Llama Stack (#2853) 2025-08-05 07:33:46 -07:00