llama-stack/llama_stack
yyymeta fb418813fc
fix: passthrough impl response.content.text (#1665)
# What does this PR do?
The current passthrough implementation returns `chatcompletion_message.content` as a `TextItem()` rather than a plain string, so it is not compatible with other providers and causes parsing errors downstream.

Change away from the generic pydantic conversion and explicitly parse out `content.text`.
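The fix can be sketched roughly as follows. This is an illustrative assumption, not the actual llama-stack code: the real types are pydantic models, but plain dataclasses stand in for them here, and the `extract_content` helper name is hypothetical.

```python
# Sketch only: dataclasses stand in for llama-stack's pydantic models,
# and extract_content is a hypothetical helper name.
from dataclasses import dataclass
from typing import Union


@dataclass
class TextItem:
    text: str
    type: str = "text"


@dataclass
class ChatCompletionMessage:
    role: str
    # The passthrough provider was returning the TextItem wrapper here
    # instead of a plain string.
    content: Union[str, TextItem]


def extract_content(message: ChatCompletionMessage) -> str:
    # Explicitly unwrap content.text rather than relying on generic
    # model conversion, so downstream consumers that expect a plain
    # string keep working.
    if isinstance(message.content, TextItem):
        return message.content.text
    return message.content
```

With this unwrapping in place, both wrapped and plain-string content yield the same `str` shape downstream.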

## Test Plan

Set up a llama server with passthrough, then run:

```shell
llama-stack-client eval run-benchmark "MMMU_Pro_standard" \
  --model-id meta-llama/Llama-3-8B \
  --output-dir /tmp/ \
  --num-examples 20
```

Runs without parsing errors.
2025-03-17 13:42:08 -07:00
| Path | Last commit | Date |
| --- | --- | --- |
| apis | fix: OpenAPI with provider get (#1627) | 2025-03-13 19:56:32 -07:00 |
| cli | fix: Fix pre-commit check (#1628) | 2025-03-13 18:57:42 -07:00 |
| distribution | feat: add support for logging config in the run.yaml (#1408) | 2025-03-14 12:36:25 -07:00 |
| models/llama | refactor: move all datetime.now() calls to UTC (#1589) | 2025-03-13 15:34:53 -07:00 |
| providers | fix: passthrough impl response.content.text (#1665) | 2025-03-17 13:42:08 -07:00 |
| scripts | refactor(test): introduce --stack-config and simplify options (#1404) | 2025-03-05 17:02:02 -08:00 |
| strong_typing | Ensure that deprecations for fields follow through to OpenAPI | 2025-02-19 13:54:04 -08:00 |
| templates | feat: [new open benchmark] BFCL_v3 (#1578) | 2025-03-14 12:50:49 -07:00 |
| __init__.py | export LibraryClient | 2024-12-13 12:08:00 -08:00 |
| env.py | refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) | 2025-03-04 14:53:47 -08:00 |
| log.py | feat: add support for logging config in the run.yaml (#1408) | 2025-03-14 12:36:25 -07:00 |
| schema_utils.py | ci: add mypy for static type checking (#1101) | 2025-02-21 13:15:40 -08:00 |