llama-stack-mirror/llama_stack/templates/meta-reference-gpu
ehhuang b9fbfed216
chore(telemetry): remove service_name entirely (#1755)
# What does this PR do?

Removes `service_name` from the telemetry configuration entirely.

## Test Plan

LLAMA_STACK_CONFIG=dev pytest -s -v \
  tests/integration/agents/test_agents.py::test_custom_tool \
  --safety-shield meta-llama/Llama-Guard-3-8B \
  --text-model accounts/fireworks/models/llama-v3p1-8b-instruct

Then verify the trace in the Jaeger UI; see
https://llama-stack.readthedocs.io/en/latest/building_applications/telemetry.html#
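For context, here is a minimal sketch of what the telemetry provider entry in this template's run.yaml might look like once `service_name` is dropped. The `provider_id`, `provider_type`, and remaining config keys are assumptions for illustration, not copied from the files touched by this PR.

```yaml
# Hypothetical sketch only: telemetry provider block with service_name removed.
# The provider identifiers and remaining config keys below are assumed, not
# taken from the actual run.yaml changed in #1755.
providers:
  telemetry:
    - provider_id: meta-reference
      provider_type: inline::meta-reference
      config:
        # service_name is gone; only sink settings remain in this sketch.
        sinks: ${env.TELEMETRY_SINKS:console,sqlite}
        sqlite_db_path: ${env.SQLITE_DB_PATH:~/.llama/distributions/meta-reference-gpu/trace_store.db}
```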
2025-03-21 15:11:56 -07:00

| File | Last commit | Date |
|------|-------------|------|
| __init__.py | Auto-generate distro yamls + docs (#468) | 2024-11-18 14:57:06 -08:00 |
| build.yaml | Rename builtin::memory -> builtin::rag | 2025-01-22 20:22:51 -08:00 |
| doc_template.md | fix: Default to port 8321 everywhere (#1734) | 2025-03-20 15:50:41 -07:00 |
| meta_reference.py | fix: Default to port 8321 everywhere (#1734) | 2025-03-20 15:50:41 -07:00 |
| run-with-safety.yaml | chore(telemetry): remove service_name entirely (#1755) | 2025-03-21 15:11:56 -07:00 |
| run.yaml | chore(telemetry): remove service_name entirely (#1755) | 2025-03-21 15:11:56 -07:00 |