Address comments

This commit is contained in:
Jiayi 2025-09-28 15:11:23 -07:00
parent 816b68fdc7
commit cf386ad8f8
2 changed files with 8 additions and 8 deletions


@@ -18,14 +18,14 @@ title: Batches
## Overview
The Batches API enables efficient processing of multiple requests in a single operation,
particularly useful for processing large datasets, batch evaluation workflows, and
cost-effective inference at scale.

The API is designed to allow use of openai client libraries for seamless integration.

This API provides the following extensions:
- idempotent batch creation

Note: This API is currently under active development and may undergo changes.

This section contains documentation for all available providers for the **batches** API.
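The diff above lists "idempotent batch creation" as an extension but does not show how it behaves. A minimal sketch of the idempotency idea, using a hypothetical in-memory registry (the names `create_batch`, `idempotency_key`, and `_batches` are illustrative, not the provider's actual API):

```python
import uuid

# Hypothetical in-memory registry mapping idempotency keys to batches.
_batches: dict[str, dict] = {}

def create_batch(input_file_id: str, idempotency_key: str) -> dict:
    """Create a batch; a repeated key returns the existing batch instead of a duplicate."""
    if idempotency_key in _batches:
        return _batches[idempotency_key]
    batch = {
        "id": f"batch_{uuid.uuid4().hex[:8]}",
        "input_file_id": input_file_id,
        "status": "validating",
    }
    _batches[idempotency_key] = batch
    return batch

first = create_batch("file-abc", idempotency_key="run-1")
second = create_batch("file-abc", idempotency_key="run-1")  # retried request
```

With this behavior, a client that retries a failed or timed-out creation request with the same key gets back the same batch rather than kicking off duplicate work.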


@@ -146,7 +146,7 @@ class NVIDIAInferenceAdapter(OpenAIMixin, Inference):
         # Convert query to text format
         if isinstance(query, str):
             query_text = query
-        elif hasattr(query, "text"):
+        elif isinstance(query, OpenAIChatCompletionContentPartTextParam):
             query_text = query.text
         else:
             raise ValueError("Query must be a string or text content part")
@@ -156,7 +156,7 @@ class NVIDIAInferenceAdapter(OpenAIMixin, Inference):
         for item in items:
             if isinstance(item, str):
                 passages.append({"text": item})
-            elif hasattr(item, "text"):
+            elif isinstance(item, OpenAIChatCompletionContentPartTextParam):
                 passages.append({"text": item.text})
             else:
                 raise ValueError("Items must be strings or text content parts")
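The change in both hunks replaces duck-typed `hasattr(..., "text")` checks with explicit `isinstance` checks. A small standalone sketch of why that is stricter, where `TextPart` is a stand-in for `OpenAIChatCompletionContentPartTextParam` and `ErrorInfo` is a hypothetical unrelated type that merely happens to have a `.text` attribute:

```python
from dataclasses import dataclass

@dataclass
class TextPart:
    """Stand-in for OpenAIChatCompletionContentPartTextParam."""
    text: str

@dataclass
class ErrorInfo:
    """Unrelated type that also happens to carry a .text attribute."""
    text: str

def to_text(query) -> str:
    if isinstance(query, str):
        return query
    elif isinstance(query, TextPart):  # explicit type check, as in the diff
        return query.text
    # hasattr(query, "text") would have silently accepted ErrorInfo here
    raise ValueError("Query must be a string or text content part")
```

With `hasattr`, any object carrying a `.text` attribute would slip through; the `isinstance` version rejects it with a clear error, at the cost of having to name the accepted type explicitly.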