llama-stack-mirror/llama_stack
ehhuang e58c7f6c37
fix(telemetry): root span not yet received (#1828)
# What does this PR do?
closes #1725 

In its attempt to make the trace_id consistent between Llama Stack and the OTel
exports, https://github.com/meta-llama/llama-stack/pull/1759 incorrectly set the
span_id in the context, which caused the root span to have a parent ID and led
to the issue in #1725.

This PR reverts the part of #1759 that set the parent context. A follow-up is
needed to make the trace IDs consistent in a proper way.
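
For context, here is a minimal, hypothetical sketch (using the OpenTelemetry Python SDK directly, not the actual llama-stack telemetry code, with illustrative IDs) of how pre-populating the parent context with a span_id turns what should be a root span into a child span:

```python
# Hypothetical sketch, not llama-stack code: shows why injecting a span_id
# into the parent context gives the "root" span a parent ID.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.trace import (
    NonRecordingSpan,
    SpanContext,
    TraceFlags,
    set_span_in_context,
)

tracer = trace.get_tracer(__name__, tracer_provider=TracerProvider())

# Illustrative IDs only.
TRACE_ID = 0x1234_5678_9ABC_DEF0_1234_5678_9ABC_DEF0
SPAN_ID = 0x1234_5678_9ABC_DEF0

# Broken behavior: the context handed to the tracer already carries a valid
# span_id, so the "root" span is created as a child and is exported with a
# parent ID (the symptom reported in #1725).
parent_ctx = set_span_in_context(
    NonRecordingSpan(
        SpanContext(
            TRACE_ID,
            SPAN_ID,
            is_remote=False,
            trace_flags=TraceFlags(TraceFlags.SAMPLED),
        )
    )
)
with tracer.start_as_current_span("root", context=parent_ctx) as span:
    assert span.parent is not None  # unexpected parent on the root span

# Reverted behavior: no pre-populated parent context, so the span is a true root.
with tracer.start_as_current_span("root") as span:
    assert span.parent is None
```

With the context left untouched (second case), the exported root span carries no parent ID, which is what the trace viewer expects.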

## Test Plan
<img width="1868" alt="image" src="https://github.com/user-attachments/assets/15e9ac18-8541-461d-b261-c4e124388cc3" />
2025-03-28 14:40:17 -07:00
| Name | Last commit | Date |
| --- | --- | --- |
| apis | feat(api): don't return a payload on file delete (#1640) | 2025-03-25 17:12:36 -07:00 |
| cli | fix: Use CONDA_DEFAULT_ENV presence as a flag to use conda mode (#1555) | 2025-03-27 17:13:22 -04:00 |
| distribution | fix: Adding chunk_size_in_tokens to playground rag_tool insert (#1826) | 2025-03-28 15:56:25 -04:00 |
| models/llama | feat: Support "stop" parameter in remote:vLLM (#1715) | 2025-03-24 12:42:55 -07:00 |
| providers | fix(telemetry): root span not yet received (#1828) | 2025-03-28 14:40:17 -07:00 |
| strong_typing | fix: Support types.UnionType in schemas (#1721) | 2025-03-20 09:54:02 -07:00 |
| templates | docs: fix remote-vllm instructions (#1805) | 2025-03-27 10:19:51 -04:00 |
| __init__.py | export LibraryClient | 2024-12-13 12:08:00 -08:00 |
| env.py | refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) | 2025-03-04 14:53:47 -08:00 |
| log.py | chore: Remove style tags from log formatter (#1808) | 2025-03-27 10:18:21 -04:00 |
| schema_utils.py | chore: make mypy happy with webmethod (#1758) | 2025-03-22 08:17:23 -07:00 |