# llama-stack-mirror/llama_stack/core/routers
slekkala1 935b8e28de
fix: Fireworks chat completion broken due to telemetry (#3392)
# What does this PR do?
Fixes Fireworks chat completions, which broke because the telemetry code
expected `response.usage` to always be present.

Closes https://github.com/llamastack/llama-stack/issues/3391
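The failure mode can be sketched in a few lines. The dataclasses and the `usage_metrics` helper below are illustrative stand-ins, not the actual llama-stack types; the point is only that telemetry must guard against a provider omitting the `usage` block instead of dereferencing it unconditionally:

```python
# Minimal sketch (hypothetical types): some providers may return a chat
# completion without a `usage` object, so metrics code must handle None.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Usage:
    prompt_tokens: int
    completion_tokens: int
    total_tokens: int


@dataclass
class ChatCompletion:
    id: str
    usage: Optional[Usage] = None  # a provider may omit this entirely


def usage_metrics(response: ChatCompletion) -> dict:
    # Guard the missing-usage case rather than assume it is present;
    # reading response.usage.total_tokens here would raise AttributeError.
    if response.usage is None:
        return {}
    return {
        "prompt_tokens": response.usage.prompt_tokens,
        "completion_tokens": response.usage.completion_tokens,
        "total_tokens": response.usage.total_tokens,
    }
```

With this guard, a response lacking `usage` simply emits no token metrics instead of crashing the request path.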

## Test Plan
1. `uv run --with llama-stack llama stack build --distro starter --image-type venv --run`
2. Send a test request:

```
curl -X POST http://0.0.0.0:8321/v1/openai/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
      "model": "fireworks/accounts/fireworks/models/llama-v3p1-8b-instruct",
      "messages": [{"role": "user", "content": "Hello!"}]
    }'
```
```
{"id":"chatcmpl-ee922a08-0df0-4974-b0d3-b322113e8bc0","choices":[{"message":{"role":"assistant","content":"Hello! How can I assist you today?","name":null,"tool_calls":null},"finish_reason":"stop","index":0,"logprobs":null}],"object":"chat.completion","created":1757456375,"model":"fireworks/accounts/fireworks/models/llama-v3p1-8b-instruct"}
```

Without the fix, the request fails as described in
https://github.com/llamastack/llama-stack/issues/3391.

Co-authored-by: Francisco Arceo <arceofrancisco@gmail.com>
2025-09-10 08:48:01 -07:00
| File | Last commit | Date |
| --- | --- | --- |
| `__init__.py` | chore(rename): move llama_stack.distribution to llama_stack.core (#2975) | 2025-07-30 23:30:53 -07:00 |
| `datasets.py` | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| `eval_scoring.py` | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| `inference.py` | fix: Fireworks chat completion broken due to telemetry (#3392) | 2025-09-10 08:48:01 -07:00 |
| `safety.py` | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| `tool_runtime.py` | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| `vector_io.py` | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |