llama-stack-mirror/llama_stack/providers
yyymeta fb418813fc
fix: passthrough impl response.content.text (#1665)
# What does this PR do?
The current passthrough impl returns chatcompletion_message.content as a
TextItem(), not a plain string, so it is not compatible with other
providers and causes parsing errors downstream.

Change away from the generic pydantic conversion and explicitly parse
out content.text.
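
A minimal sketch of the shape of the fix, assuming a pydantic `TextItem` with a `text` field; the `TextItem` stand-in and the `extract_content_text` helper below are illustrative names, not the actual llama-stack types:

```python
from pydantic import BaseModel


class TextItem(BaseModel):
    """Stand-in for the content-item object the passthrough client returns."""
    type: str = "text"
    text: str


def extract_content_text(content) -> str:
    # Other providers return message content as a plain string; normalize
    # a TextItem to match instead of passing the pydantic object through.
    if isinstance(content, TextItem):
        return content.text
    if isinstance(content, str):
        return content
    raise ValueError(f"unexpected content type: {type(content)}")


# Usage:
print(extract_content_text(TextItem(text="hello")))  # -> "hello"
```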

## Test Plan

setup llama server with passthrough

```
llama-stack-client eval run-benchmark "MMMU_Pro_standard"   --model-id    meta-llama/Llama-3-8B   --output-dir /tmp/   --num-examples 20
```
Works without parsing errors.
2025-03-17 13:42:08 -07:00
| Name | Last commit | Date |
| --- | --- | --- |
| `inline` | feat: [new open benchmark] BFCL_v3 (#1578) | 2025-03-14 12:50:49 -07:00 |
| `registry` | feat: [new open benchmark] BFCL_v3 (#1578) | 2025-03-14 12:50:49 -07:00 |
| `remote` | fix: passthrough impl response.content.text (#1665) | 2025-03-17 13:42:08 -07:00 |
| `tests` | refactor(test): introduce --stack-config and simplify options (#1404) | 2025-03-05 17:02:02 -08:00 |
| `utils` | feat: [new open benchmark] BFCL_v3 (#1578) | 2025-03-14 12:50:49 -07:00 |
| `__init__.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `datatypes.py` | chore: move all Llama Stack types from llama-models to llama-stack (#1098) | 2025-02-14 09:10:59 -08:00 |