Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-12-11 19:56:03 +00:00
fix(mypy): resolve provider utility and testing type issues
Fixes mypy type errors across provider utilities and testing infrastructure (Phase 2e):
- mcp.py (2 errors fixed):
  - Cast sse_client to Any to handle its signature being incompatible with streamablehttp_client
  - Wrap ImageContent data in _URLOrData(data=...) for proper ImageContentItem construction
    (see the first sketch after this list)
- batches.py (1 error fixed):
  - Rename the walrus-operator variable from `body` to `request_body` so it no longer shadows
    the file-content `body` variable (bytes | memoryview) defined earlier in the scope
- api_recorder.py (1 error fixed):
  - Cast the Pydantic field-annotation assignment to the proper type when monkey-patching the
    OpenAI CompletionChoice model to accept None in finish_reason (see the second sketch below)
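
A rough sketch of the shape of the two mcp.py changes. The import paths, helper names,
surrounding variables, and the image= keyword are assumptions for illustration, not code
copied from mcp.py; only the cast-to-Any and the _URLOrData wrapping come from the change itself:

```python
from typing import Any, cast

# Assumed import paths for this sketch; the real mcp.py may differ.
from mcp.client.sse import sse_client
from mcp.client.streamable_http import streamablehttp_client

from llama_stack.apis.common.content_types import ImageContentItem, _URLOrData


def pick_client_factory(use_streamable_http: bool) -> Any:
    # sse_client and streamablehttp_client have incompatible signatures, so the
    # SSE factory is cast to Any before both are driven through one call site.
    if use_streamable_http:
        return streamablehttp_client
    return cast(Any, sse_client)


def to_image_item(data: str) -> ImageContentItem:
    # Wrap the raw image payload from an MCP ImageContent block in
    # _URLOrData(data=...) so ImageContentItem gets the type mypy expects.
    return ImageContentItem(image=_URLOrData(data=data))
```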
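
A minimal sketch of the api_recorder.py cast pattern, assuming the monkey-patch goes through
CompletionChoice.model_fields; the concrete annotation assigned (shown here as Optional[str])
is a stand-in, not the value used in api_recorder.py:

```python
from typing import Any, Optional, cast

from openai.types.completion_choice import CompletionChoice

# FieldInfo.annotation is typed more narrowly than the relaxed annotation being
# assigned, so the new value is cast before the monkey-patch assignment.
CompletionChoice.model_fields["finish_reason"].annotation = cast(type[Any], Optional[str])
CompletionChoice.model_rebuild(force=True)
```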
Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
This commit is contained in:
parent fcf07790c8
commit 0c95140ca7

3 changed files with 10 additions and 6 deletions
batches.py
@@ -419,8 +419,8 @@ class ReferenceBatchesImpl(Batches):
                 )
                 valid = False

-        if (body := request.get("body")) and isinstance(body, dict):
-            if body.get("stream", False):
+        if (request_body := request.get("body")) and isinstance(request_body, dict):
+            if request_body.get("stream", False):
                 errors.append(
                     BatchError(
                         code="streaming_unsupported",