llama-stack-mirror/src/llama_stack/models/llama
Ashwin Bharambe fcf07790c8
fix(mypy): resolve model implementation typing issues (#3934)
## Summary

Fixes mypy type errors across 4 model implementation files (Phase 2d of the
mypy suppression removal plan):
- `src/llama_stack/models/llama/llama3/multimodal/image_transform.py`
(10 errors fixed)
- `src/llama_stack/models/llama/checkpoint.py` (2 errors fixed)
- `src/llama_stack/models/llama/hadamard_utils.py` (1 error fixed)
- `src/llama_stack/models/llama/llama3/multimodal/encoder_utils.py` (1
error fixed)

## Changes

### image_transform.py
- Fixed return type annotation for `find_supported_resolutions` from
`Tensor` to `list[tuple[int, int]]`
- Fixed parameter and return type annotations for
`resize_without_distortion` from `Tensor` to `Image.Image`
- Resolved variable shadowing by using separate names:
`possible_resolutions_list` for the list and
`possible_resolutions_tensor` for the tensor (see the sketch below)
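
For illustration, the corrected annotations look roughly like this; the real
functions in `image_transform.py` have fuller bodies, and the parameter names
shown here are assumptions, not the exact signatures:

```python
from PIL import Image
import torch


def find_supported_resolutions(max_num_chunks: int, patch_size: int) -> list[tuple[int, int]]:
    # Previously annotated as returning `Tensor`; the function actually
    # builds a plain Python list of (height, width) pairs. Body simplified.
    return [
        (patch_size * i, patch_size * (max_num_chunks // i))
        for i in range(1, max_num_chunks + 1)
        if max_num_chunks % i == 0
    ]


def resize_without_distortion(
    image: Image.Image,
    target_size: tuple[int, int],
    max_upscaling_size: int | None,
) -> Image.Image:
    # The image parameter and return value were annotated as `Tensor`, but
    # the function operates on PIL images end to end. Body simplified.
    return image.resize(target_size)


# Shadowing fix: the list and its tensor view now live under distinct names.
possible_resolutions_list = find_supported_resolutions(4, 448)          # list[tuple[int, int]]
possible_resolutions_tensor = torch.tensor(possible_resolutions_list)   # Tensor for downstream math
```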

### checkpoint.py
- Replaced deprecated `torch.BFloat16Tensor` and
`torch.cuda.BFloat16Tensor` with
`torch.set_default_dtype(torch.bfloat16)`
- Fixed variable shadowing by renaming the numpy array to `ckpt_paths_array`
to distinguish it from the `ckpt_paths: list[Path]` parameter (see the sketch below)
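
Schematically, the two fixes amount to the following; the `load_shards`
wrapper and its body are hypothetical, only the renamed array and the dtype
call mirror the actual change:

```python
from pathlib import Path

import numpy as np
import torch

# The deprecated tensor-type defaults (`torch.BFloat16Tensor` /
# `torch.cuda.BFloat16Tensor`) are replaced by a plain dtype default;
# device placement is handled separately.
torch.set_default_dtype(torch.bfloat16)


def load_shards(ckpt_paths: list[Path]) -> None:
    # Shadowing fix: the numpy array gets its own name instead of re-binding
    # the `ckpt_paths: list[Path]` parameter, so mypy keeps both types straight.
    ckpt_paths_array = np.array([str(p) for p in ckpt_paths])
    ...
```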

### hadamard_utils.py
- Added an `isinstance` assertion to narrow the type from `nn.Module` to
`nn.Linear` before accessing the `in_features` attribute (see the sketch below)
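
The narrowing pattern looks like this; `_linear_in_features` is a made-up
helper name, not the actual function in `hadamard_utils.py`:

```python
import torch.nn as nn


def _linear_in_features(module: nn.Module) -> int:
    # The assert guards against misuse at runtime and lets mypy narrow
    # `module` from nn.Module to nn.Linear, so `.in_features` type-checks.
    assert isinstance(module, nn.Linear), f"expected nn.Linear, got {type(module).__name__}"
    return module.in_features
```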

### encoder_utils.py
- Fixed variable shadowing by using `masks_list` for list accumulation
and `masks` for the final Tensor result (see the sketch below)
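
In sketch form, assuming a placeholder `build_masks` function (the real mask
construction in `encoder_utils.py` is different):

```python
import torch


def build_masks(num_masks: int, seq_len: int) -> torch.Tensor:
    # Accumulate into `masks_list`, then bind the stacked result to `masks`
    # once, instead of reusing a single name for both the list and the Tensor.
    masks_list: list[torch.Tensor] = []
    for _ in range(num_masks):
        masks_list.append(torch.zeros(seq_len, seq_len))
    masks = torch.stack(masks_list)
    return masks
```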

## Test plan

- Verified all files pass mypy type checking (only optional-dependency
import warnings remain)
- No functional changes: only type annotations and variable-naming
improvements

Stacks on PR #3933

Co-authored-by: Claude <noreply@anthropic.com>
2025-10-28 10:28:29 -07:00
| Name | Last commit | Date |
| --- | --- | --- |
| `llama3` | fix(mypy): resolve model implementation typing issues (#3934) | 2025-10-28 10:28:29 -07:00 |
| `llama3_1` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `llama3_2` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `llama3_3` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `llama4` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `resources` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `__init__.py` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `checkpoint.py` | fix(mypy): resolve model implementation typing issues (#3934) | 2025-10-28 10:28:29 -07:00 |
| `datatypes.py` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `hadamard_utils.py` | fix(mypy): resolve model implementation typing issues (#3934) | 2025-10-28 10:28:29 -07:00 |
| `prompt_format.py` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `quantize_impls.py` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `sku_list.py` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `sku_types.py` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| `tokenizer_utils.py` | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |