llama-stack-mirror/llama_stack
Charlie Doern e258290213 fix: use logger for console telemetry
Currently `print` is being used with custom formatting to produce telemetry output in the console_span_processor.

This causes telemetry not to show up in log files when using `LLAMA_STACK_LOG_FILE`; during testing it looks like telemetry is not being captured when it actually is.

Switch to using Rich formatting with the logger, and strip the formatting off when a log file is being used so the output looks normal.

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-07-24 15:54:49 -04:00
apis feat(registry): make the Stack query providers for model listing (#2862) 2025-07-24 10:39:53 -07:00
cli fix: honour deprecation of --config and --template (#2856) 2025-07-22 20:48:23 -07:00
distribution chore: return webmethod from find_matching_route (#2883) 2025-07-24 11:37:21 -07:00
models chore(api): add mypy coverage to chat_format (#2654) 2025-07-18 11:56:53 +02:00
providers fix: use logger for console telemetry 2025-07-24 15:54:49 -04:00
strong_typing chore: enable pyupgrade fixes (#1806) 2025-05-01 14:23:50 -07:00
templates fix: starter template and litellm backward compat conflict for openai (#2885) 2025-07-24 17:28:37 +02:00
ui fix: re-hydrate requirement and fix package (#2774) 2025-07-16 05:46:15 -04:00
__init__.py export LibraryClient 2024-12-13 12:08:00 -08:00
env.py refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) 2025-03-04 14:53:47 -08:00
log.py fix: use logger for console telemetry 2025-07-24 15:54:49 -04:00
schema_utils.py chore: enable pyupgrade fixes (#1806) 2025-05-01 14:23:50 -07:00