Charlie Doern e258290213 fix: use logger for console telemetry
Currently `print` is used with custom formatting to produce telemetry output in the console_span_processor.

This causes telemetry not to show up in log files when using `LLAMA_STACK_LOG_FILE`; during testing it looks like telemetry is not being captured at all in that case.

Switch to Rich formatting with the logger, then strip the formatting when a log file is in use so the output looks normal.
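The approach described above can be sketched roughly as follows. This is an illustrative sketch, not the actual llama-stack code: the names `build_logger`, `emit_span`, and the regex-based `strip_markup` are hypothetical stand-ins (the real implementation uses Rich's own rendering), but it shows the idea of routing span output through the stdlib logger and stripping markup when `LLAMA_STACK_LOG_FILE` is set.

```python
import logging
import os
import re

# Rich-style markup tags look like [bold] ... [/bold]; this regex is an
# illustrative way to strip them (Rich has its own APIs for this).
_MARKUP_TAG = re.compile(r"\[/?[a-z_ ]+\]")


def strip_markup(message: str) -> str:
    """Remove Rich-style markup so plain text lands in the log file."""
    return _MARKUP_TAG.sub("", message)


def build_logger() -> logging.Logger:
    """Route span output through the stdlib logger instead of print()."""
    logger = logging.getLogger("console_span_processor")
    logger.setLevel(logging.INFO)
    log_file = os.environ.get("LLAMA_STACK_LOG_FILE")
    if log_file:
        # File handler: telemetry now shows up in the log file.
        handler: logging.Handler = logging.FileHandler(log_file)
    else:
        handler = logging.StreamHandler()
    handler.setFormatter(logging.Formatter("%(asctime)s %(name)s: %(message)s"))
    logger.addHandler(handler)
    return logger


def emit_span(logger: logging.Logger, name: str, duration_ms: float) -> None:
    """Emit one span line, keeping file output free of markup."""
    msg = f"[bold]{name}[/bold] took {duration_ms:.1f}ms"
    if os.environ.get("LLAMA_STACK_LOG_FILE"):
        msg = strip_markup(msg)  # plain text for the log file
    logger.info(msg)
```

The key design point is that formatting is applied at emit time and removed only when a file destination is configured, so the console keeps its styling while log files stay readable.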

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-07-24 15:54:49 -04:00
inline        fix: use logger for console telemetry                                      2025-07-24 15:54:49 -04:00
registry      chore: remove *_openai_compat providers (#2849)                            2025-07-22 10:25:36 -07:00
remote        feat(registry): make the Stack query providers for model listing (#2862)   2025-07-24 10:39:53 -07:00
utils         feat(registry): make the Stack query providers for model listing (#2862)   2025-07-24 10:39:53 -07:00
__init__.py   API Updates (#73)                                                          2024-09-17 19:51:35 -07:00
datatypes.py  feat(registry): make the Stack query providers for model listing (#2862)   2025-07-24 10:39:53 -07:00