llama-stack-mirror/llama_stack/providers/inline
Charlie Doern e258290213 fix: use logger for console telemetry
Currently `print` is being used with custom formatting to achieve telemetry output in the console_span_processor.

This causes telemetry not to show up in log files when `LLAMA_STACK_LOG_FILE` is used. During testing it looks like telemetry is not being captured when it is.

Switch to using Rich formatting with the logger, then strip the formatting off when a log file is being used so the output looks normal.
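A minimal sketch of the approach the commit describes, using only the stdlib (the actual console_span_processor uses Rich; here a regex stands in for stripping Rich-style `[bold]...[/bold]` markup, and the logger/handler names are assumptions, not the real implementation):

```python
# Hypothetical sketch: route telemetry through the stdlib logger instead of
# print(), and strip Rich-style markup when LLAMA_STACK_LOG_FILE is set so
# the log file stays plain text.
import logging
import os
import re

# Matches Rich markup tags such as [bold cyan] and [/bold cyan].
RICH_MARKUP = re.compile(r"\[/?[a-zA-Z0-9 _.#]+\]")


class StripRichFormatter(logging.Formatter):
    """Formatter that removes Rich markup so file output looks normal."""

    def format(self, record: logging.LogRecord) -> str:
        text = super().format(record)
        return RICH_MARKUP.sub("", text)


def make_telemetry_logger() -> logging.Logger:
    logger = logging.getLogger("console_span_processor")
    logger.setLevel(logging.INFO)
    # Console handler keeps the markup (a Rich handler would render it).
    logger.addHandler(logging.StreamHandler())
    # Mirror output to the file, stripped of markup, when the env var is set.
    log_file = os.environ.get("LLAMA_STACK_LOG_FILE")
    if log_file:
        file_handler = logging.FileHandler(log_file)
        file_handler.setFormatter(StripRichFormatter("%(message)s"))
        logger.addHandler(file_handler)
    return logger
```

With this split, the console keeps its rich formatting while the same log records land in the file as plain text, which is the behavior the commit message asks for.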

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-07-24 15:54:49 -04:00
agents fix(agent): ensure turns are sorted (#2854) 2025-07-22 10:24:51 -07:00
datasetio chore(refact): move paginate_records fn outside of datasetio (#2137) 2025-05-12 10:56:14 -07:00
eval chore: remove nested imports (#2515) 2025-06-26 08:01:05 +05:30
files/localfs feat: enable auth for LocalFS Files Provider (#2773) 2025-07-18 19:11:01 -07:00
inference feat(registry): make the Stack query providers for model listing (#2862) 2025-07-24 10:39:53 -07:00
ios/inference chore: removed executorch submodule (#1265) 2025-02-25 21:57:21 -08:00
post_training chore: add mypy post training (#2675) 2025-07-09 15:44:39 +02:00
safety ci: test safety with starter (#2628) 2025-07-09 16:53:50 +02:00
scoring fix: allow default empty vars for conditionals (#2570) 2025-07-01 14:42:05 +02:00
telemetry fix: use logger for console telemetry 2025-07-24 15:54:49 -04:00
tool_runtime feat: Add ChunkMetadata to Chunk (#2497) 2025-06-25 15:55:23 -04:00
vector_io chore: Added openai compatible vector io endpoints for chromadb (#2489) 2025-07-23 13:51:58 -07:00
__init__.py impls -> inline, adapters -> remote (#381) 2024-11-06 14:54:05 -08:00