llama-stack/llama_stack/providers/remote
yyymeta fb418813fc
fix: passthrough impl response.content.text (#1665)
# What does this PR do?
The current passthrough impl returns chatcompletion_message.content as a
TextItem(), not a plain string, so it is not compatible with other
providers and causes parsing errors downstream.

Change away from the generic pydantic conversion and explicitly parse
out content.text instead.
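
The fix described above can be sketched roughly as follows. This is a minimal illustration, not the actual provider code: `TextContentItem` stands in for the TextItem-style content model, and `extract_content` is a hypothetical helper showing the explicit-parse approach in place of generic pydantic conversion.

```python
# Hypothetical sketch of the fix: pull the plain string out of a
# TextItem-style content object instead of passing the pydantic object
# through as-is. Names here are illustrative, not the real API.
from dataclasses import dataclass
from typing import Union, List


@dataclass
class TextContentItem:
    """Stand-in for the TextItem content model returned by the server."""
    text: str
    type: str = "text"


Content = Union[str, TextContentItem, List["Content"]]


def extract_content(content: Content) -> str:
    """Return a plain string so downstream parsers never see a TextItem."""
    if isinstance(content, TextContentItem):
        return content.text
    if isinstance(content, list):
        # Concatenate the text of each item in a multi-part content list.
        return "".join(extract_content(item) for item in content)
    return content  # already a plain str
```

With this helper, a passthrough response whose content is a `TextContentItem` (or a list of them) is normalized to a `str` before being handed to other providers.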

## Test Plan

Set up a llama server with passthrough:

```
llama-stack-client eval run-benchmark "MMMU_Pro_standard"   --model-id    meta-llama/Llama-3-8B   --output-dir /tmp/   --num-examples 20
```
The benchmark runs without parsing errors.
2025-03-17 13:42:08 -07:00
| Name | Last commit message | Date |
|------|---------------------|------|
| agents | test: add unit test to ensure all config types are instantiable (#1601) | 2025-03-12 22:29:58 -07:00 |
| datasetio | test: add unit test to ensure all config types are instantiable (#1601) | 2025-03-12 22:29:58 -07:00 |
| inference | fix: passthrough impl response.content.text (#1665) | 2025-03-17 13:42:08 -07:00 |
| safety | test: add unit test to ensure all config types are instantiable (#1601) | 2025-03-12 22:29:58 -07:00 |
| tool_runtime | test: add unit test to ensure all config types are instantiable (#1601) | 2025-03-12 22:29:58 -07:00 |
| vector_io | test: add unit test to ensure all config types are instantiable (#1601) | 2025-03-12 22:29:58 -07:00 |
| __init__.py | impls -> inline, adapters -> remote (#381) | 2024-11-06 14:54:05 -08:00 |