llama-stack/llama_stack/distribution
Sébastien Han 7cf1e24c4e
feat(logging): implement category-based logging (#1362)
# What does this PR do?

This commit introduces a new logging system that allows loggers to be assigned a category while retaining the logger name based on the file name. The log format includes both the logger name and the category, producing output like:

```
INFO     2025-03-03 21:44:11,323 llama_stack.distribution.stack:103 [core]: Tool_groups: builtin::websearch served by
         tavily-search
```

Key features include:

- Category-based logging: Loggers can be assigned a category (e.g.,
  "core", "server") when they are created in code. The logger can be loaded like
  this: `logger = get_logger(name=__name__, category="server")` (see the sketch after this list).
- Environment variable control: Log levels can be configured per-category using the
  `LLAMA_STACK_LOGGING` environment variable. For example,
  `LLAMA_STACK_LOGGING="server=DEBUG;core=debug"` enables DEBUG level for the
  "server" and "core" categories.
- `LLAMA_STACK_LOGGING="all=debug"` sets DEBUG level globally for all categories and
  third-party libraries.
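
To make the two mechanisms above concrete, here is a minimal sketch of how a category-aware `get_logger` and the `LLAMA_STACK_LOGGING` parsing could fit together. The helper names, the `LoggerAdapter` approach, and the parsing details are illustrative assumptions, not the actual implementation:

```python
# Minimal sketch, not the actual implementation: a category-aware logger
# factory plus a parser for LLAMA_STACK_LOGGING. Helper names and the use
# of LoggerAdapter are assumptions for illustration only.
import logging
import os


def parse_logging_env(value: str) -> dict:
    """Turn e.g. "server=DEBUG;core=debug" into {"server": logging.DEBUG, ...}."""
    levels = {}
    for entry in value.split(";"):
        if not entry.strip():
            continue
        category, _, level = entry.partition("=")
        levels[category.strip()] = getattr(logging, level.strip().upper(), logging.INFO)
    return levels


_CATEGORY_LEVELS = parse_logging_env(os.environ.get("LLAMA_STACK_LOGGING", ""))


def get_logger(name: str, category: str = "uncategorized") -> logging.LoggerAdapter:
    """Return a logger named after the calling module and tagged with a category."""
    logger = logging.getLogger(name)
    # "all" acts as the fallback level for categories without an explicit setting.
    level = _CATEGORY_LEVELS.get(category, _CATEGORY_LEVELS.get("all", logging.INFO))
    logger.setLevel(level)
    return logging.LoggerAdapter(logger, {"category": category})
```

With a formatter whose format string includes `%(category)s`, this produces lines like the example output above.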

This provides fine-grained control over logging levels while maintaining a clean and informative log format.

The formatter uses the rich library, which provides nice colors and better stack traces, like so:

```
ERROR    2025-03-03 21:49:37,124 asyncio:1758 [uncategorized]: unhandled exception during asyncio.run() shutdown
         task: <Task finished name='Task-16' coro=<handle_signal.<locals>.shutdown() done, defined at
         /Users/leseb/Documents/AI/llama-stack/llama_stack/distribution/server/server.py:146>
         exception=UnboundLocalError("local variable 'loop' referenced before assignment")>
         ╭────────────────────────────────────── Traceback (most recent call last) ───────────────────────────────────────╮
         │ /Users/leseb/Documents/AI/llama-stack/llama_stack/distribution/server/server.py:178 in shutdown                │
         │                                                                                                                │
         │   175 │   │   except asyncio.CancelledError:                                                                   │
         │   176 │   │   │   pass                                                                                         │
         │   177 │   │   finally:                                                                                         │
         │ ❱ 178 │   │   │   loop.stop()                                                                                  │
         │   179 │                                                                                                        │
         │   180 │   loop = asyncio.get_running_loop()                                                                    │
         │   181 │   loop.create_task(shutdown())                                                                         │
         ╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
         UnboundLocalError: local variable 'loop' referenced before assignment
```
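
For reference, a minimal sketch of how a rich-based handler might be wired into the standard logging setup; the default-category filter and the exact format string are assumptions, though `RichHandler` and `rich_tracebacks` are standard parts of the rich library:

```python
# Sketch only: route standard logging through rich's RichHandler so records
# get colorized output and pretty tracebacks. The format string and the
# default-category filter below are illustrative assumptions.
import logging

from rich.logging import RichHandler


class _DefaultCategoryFilter(logging.Filter):
    """Give records from third-party loggers a category so formatting never fails."""

    def filter(self, record: logging.LogRecord) -> bool:
        if not hasattr(record, "category"):
            record.category = "uncategorized"
        return True


handler = RichHandler(rich_tracebacks=True, show_path=False)
handler.addFilter(_DefaultCategoryFilter())

logging.basicConfig(
    level=logging.INFO,
    format="%(name)s:%(lineno)d [%(category)s]: %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
    handlers=[handler],
)
```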

Co-authored-by: Ashwin Bharambe <@ashwinb>
Signed-off-by: Sébastien Han <seb@redhat.com>


## Test Plan

```
python -m llama_stack.distribution.server.server --yaml-config ./llama_stack/templates/ollama/run.yaml
INFO     2025-03-03 21:55:35,918 __main__:365 [server]: Using config file: llama_stack/templates/ollama/run.yaml           
INFO     2025-03-03 21:55:35,925 __main__:378 [server]: Run configuration:                                                 
INFO     2025-03-03 21:55:35,928 __main__:380 [server]: apis:                                                              
         - agents                                                     
``` 
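
The same command can be combined with the per-category levels described above, e.g.:

```
LLAMA_STACK_LOGGING="server=DEBUG;core=debug" python -m llama_stack.distribution.server.server --yaml-config ./llama_stack/templates/ollama/run.yaml
```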

---------

Signed-off-by: Sébastien Han <seb@redhat.com>
Co-authored-by: Ashwin Bharambe <ashwin.bharambe@gmail.com>
2025-03-07 11:34:30 -08:00
| Name | Last commit | Date |
|------|-------------|------|
| `routers` | feat(logging): implement category-based logging (#1362) | 2025-03-07 11:34:30 -08:00 |
| `server` | feat(logging): implement category-based logging (#1362) | 2025-03-07 11:34:30 -08:00 |
| `store` | refactor: move a few tests to top-level tests/ directory | 2025-03-03 17:33:39 -08:00 |
| `ui` | docs: update test_agents to use new Agent SDK API (#1402) | 2025-03-06 15:21:12 -08:00 |
| `utils` | chore: remove unused build dir (#1379) | 2025-03-05 15:40:00 -08:00 |
| `__init__.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `build.py` | build(container): misc improvements (#1291) | 2025-02-28 10:01:52 -08:00 |
| `build_conda_env.sh` | chore: remove straggler references to llama-models (#1345) | 2025-03-01 14:26:03 -08:00 |
| `build_container.sh` | chore: remove straggler references to llama-models (#1345) | 2025-03-01 14:26:03 -08:00 |
| `build_venv.sh` | chore: remove straggler references to llama-models (#1345) | 2025-03-01 14:26:03 -08:00 |
| `client.py` | chore: move all Llama Stack types from llama-models to llama-stack (#1098) | 2025-02-14 09:10:59 -08:00 |
| `common.sh` | fix: Fixing some small issues with the build scripts (#1132) | 2025-02-19 22:20:49 -08:00 |
| `configure.py` | fix: resolve pydantic warning on .dict() usage (#1445) | 2025-03-06 11:27:47 -08:00 |
| `datatypes.py` | fix!: update eval-tasks -> benchmarks (#1032) | 2025-02-13 16:40:58 -08:00 |
| `distribution.py` | chore(lint): update Ruff ignores for project conventions and maintainability (#1184) | 2025-02-28 09:36:49 -08:00 |
| `inspect.py` | fix: improve signal handling and update dependencies (#1044) | 2025-02-13 08:07:59 -08:00 |
| `library_client.py` | fix: raise error when request param failed to convert (#1339) | 2025-03-01 10:39:05 -08:00 |
| `request_headers.py` | Add X-LlamaStack-Client-Version, rename ProviderData -> Provider-Data (#735) | 2025-01-09 11:51:36 -08:00 |
| `resolver.py` | feat(logging): implement category-based logging (#1362) | 2025-03-07 11:34:30 -08:00 |
| `stack.py` | feat(logging): implement category-based logging (#1362) | 2025-03-07 11:34:30 -08:00 |
| `start_stack.sh` | feat(logging): implement category-based logging (#1362) | 2025-03-07 11:34:30 -08:00 |