llama-stack-mirror/llama_stack/templates/remote-vllm

Latest commit b9fbfed216 by ehhuang:
chore(telemetry): remove service_name entirely (#1755)
# What does this PR do?


## Test Plan

```
LLAMA_STACK_CONFIG=dev pytest -s -v \
  tests/integration/agents/test_agents.py::test_custom_tool \
  --safety-shield meta-llama/Llama-Guard-3-8B \
  --text-model accounts/fireworks/models/llama-v3p1-8b-instruct
```

Then verify the trace in the Jaeger UI:
https://llama-stack.readthedocs.io/en/latest/building_applications/telemetry.html#
Committed 2025-03-21 15:11:56 -07:00
| File | Last commit | Date |
|------|-------------|------|
| `__init__.py` | Auto-generate distro yamls + docs (#468) | 2024-11-18 14:57:06 -08:00 |
| `build.yaml` | refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) | 2025-03-04 14:53:47 -08:00 |
| `doc_template.md` | fix: Default to port 8321 everywhere (#1734) | 2025-03-20 15:50:41 -07:00 |
| `run-with-safety.yaml` | chore(telemetry): remove service_name entirely (#1755) | 2025-03-21 15:11:56 -07:00 |
| `run.yaml` | chore(telemetry): remove service_name entirely (#1755) | 2025-03-21 15:11:56 -07:00 |
| `vllm.py` | test: Add default vLLM URL in remote-vllm template (#1736) | 2025-03-21 07:31:59 -07:00 |
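The `run.yaml` in this directory wires the distribution's inference provider to a remote vLLM server. A minimal sketch of what such a provider entry typically looks like is shown below; the exact field values, the `VLLM_URL` environment variable, and the default URL are assumptions for illustration, not copied from this repository's file:

```yaml
# Hypothetical fragment of a remote-vllm run.yaml (field values are assumed).
# The provider points the stack at an OpenAI-compatible vLLM endpoint,
# with the URL overridable via an environment variable.
providers:
  inference:
  - provider_id: vllm-inference
    provider_type: remote::vllm
    config:
      url: ${env.VLLM_URL:http://localhost:8000/v1}
```

Per the `doc_template.md` commit above, the Llama Stack server itself defaults to port 8321; the vLLM endpoint is a separate server that this config merely points to.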