llama-stack-mirror/llama_stack/providers/remote/inference/nvidia
Mustafa Elbehery c3b2b06974
refactor(logging): rename llama_stack logger categories (#3065)
# What does this PR do?
This PR renames the categories of llama_stack loggers so that each category matches its package name, and incorporates review feedback from the initial
https://github.com/meta-llama/llama-stack/pull/2868. It is a follow-up
to #3061.


Replaces https://github.com/meta-llama/llama-stack/pull/2868
Part of https://github.com/meta-llama/llama-stack/issues/2865

cc @leseb @rhuss

Signed-off-by: Mustafa Elbehery <melbeher@redhat.com>
2025-08-21 17:31:04 -07:00
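The motivation for aligning logger categories with package names is that hierarchically named loggers can be configured at the parent level and the setting flows down to every child. A minimal sketch with Python's standard `logging` module (this illustrates the naming pattern only and is not llama_stack's actual logging implementation):

```python
import logging

# Hypothetical sketch: name loggers after their package path, e.g. the
# category for this directory would mirror
# "llama_stack.providers.remote.inference.nvidia".
parent = logging.getLogger("llama_stack.providers")
parent.setLevel(logging.DEBUG)

# A child logger created with the full package path...
child = logging.getLogger("llama_stack.providers.remote.inference.nvidia")

# ...inherits its effective level from the parent category, so one
# configuration line covers every provider under "llama_stack.providers".
print(child.getEffectiveLevel() == logging.DEBUG)
```

With mismatched category names, such parent-level configuration silently misses loggers, which is the kind of drift this rename addresses.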
| File | Latest commit | Date |
| --- | --- | --- |
| __init__.py | add NVIDIA NIM inference adapter (#355) | 2024-11-23 15:59:00 -08:00 |
| config.py | fix: allow default empty vars for conditionals (#2570) | 2025-07-01 14:42:05 +02:00 |
| models.py | ci: test safety with starter (#2628) | 2025-07-09 16:53:50 +02:00 |
| NVIDIA.md | docs: update the docs for NVIDIA Inference provider (#3227) | 2025-08-21 15:59:39 -07:00 |
| nvidia.py | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| openai_utils.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| utils.py | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |