llama-stack-mirror/tests/unit/models
skamenan7 f5c1935c18 fix: Resolve Llama4 tool calling 500 errors
This commit addresses issue #2584 by:
- Implementing lazy torch imports in llama4/chat_format.py and datatypes.py to prevent ModuleNotFoundError in torch-free environments.
- Adding comprehensive unit tests to verify that text-only functionality works without torch and that vision features fail gracefully.
- Ensuring the module remains importable and functional for text-based operations, thus resolving the 500 internal server errors.
2025-07-23 15:20:17 -04:00
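The lazy-import approach described in the commit message can be sketched as below. This is a minimal illustration of the pattern, not the actual code from `llama4/chat_format.py`; the function and variable names (`_load_torch`, `encode_text`, `encode_image`) are hypothetical.

```python
def _load_torch():
    """Import torch only when a caller actually needs tensor features.

    Deferring the import keeps the module importable in torch-free
    environments, which is the fix the commit describes for the
    500 errors caused by ModuleNotFoundError at import time.
    """
    try:
        import torch
        return torch
    except ModuleNotFoundError as exc:
        raise ModuleNotFoundError(
            "torch is required for vision features; "
            "text-only functionality works without it"
        ) from exc


def encode_text(prompt: str) -> list[str]:
    # Text-only path: no torch dependency, so this works everywhere.
    # (Trivial tokenization stands in for the real logic.)
    return prompt.split()


def encode_image(image_bytes: bytes):
    # Vision path: torch is imported lazily, so environments without
    # it fail here with a clear message instead of at module import.
    torch = _load_torch()
    return torch.frombuffer(bytearray(image_bytes), dtype=torch.uint8)
```

With this structure, importing the module and calling `encode_text` never touches torch; only `encode_image` triggers the import, which is what allows the unit tests mentioned above to verify graceful failure of vision features in a torch-free environment.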
llama                              chore: remove usage of load_tiktoken_bpe (#2276)                                    2025-06-02 07:33:37 -07:00
test_llama4_import_torch_free.py   fix: Resolve Llama4 tool calling 500 errors                                         2025-07-23 15:20:17 -04:00
test_prompt_adapter.py             fix: remove async test markers (fix pre-commit) (#2808)                             2025-07-17 21:35:28 -07:00
test_sku_resolve_alias.py          refactor: make sku_list resolve provider aliases generically                        2025-07-23 15:20:17 -04:00
test_system_prompts.py             chore(test): migrate unit tests from unittest to pytest for system prompt (#2789)  2025-07-18 11:54:02 +02:00