Mirror of https://github.com/meta-llama/llama-stack.git
This commit addresses issue #2584 by:

- Implementing lazy torch imports in llama4/chat_format.py and datatypes.py to prevent `ModuleNotFoundError` in torch-free environments (a sketch of the pattern follows this list).
- Adding comprehensive unit tests to verify that text-only functionality works without torch and that vision features fail gracefully.
- Ensuring the module remains importable and functional for text-based operations, thus resolving the 500 internal server errors.
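Lazy imports defer loading a heavy optional dependency until the first code path that actually needs it. The sketch below is illustrative only, under assumed names: `_require_torch` and `transform_image` are hypothetical helpers, not the actual llama-stack API.

```python
# Minimal sketch of a lazy torch import; names here are hypothetical,
# not the functions from this commit.

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    import torch  # resolved by type checkers only, never at runtime


def _require_torch():
    """Import torch on first use, failing with a clear message if absent."""
    try:
        import torch
        return torch
    except ModuleNotFoundError as exc:
        raise ModuleNotFoundError(
            "This vision feature requires torch; install it to enable "
            "image processing."
        ) from exc


def transform_image(raw_bytes: bytes):
    """Vision path: torch is imported lazily, so text-only deployments
    never pay the import cost or crash on a missing dependency."""
    torch = _require_torch()
    return torch.frombuffer(bytearray(raw_bytes), dtype=torch.uint8)
```

A matching pytest sketch (also illustrative, not the tests added by this commit) can simulate a torch-free environment by blocking the import, then assert that the vision path fails with the clear error while module import itself still succeeds:

```python
# Hypothetical test: hide torch via a patched __import__, a standard trick
# for simulating a missing optional dependency.

import builtins
import sys

import pytest


def test_vision_fails_gracefully_without_torch(monkeypatch):
    real_import = builtins.__import__

    def no_torch(name, *args, **kwargs):
        if name == "torch" or name.startswith("torch."):
            raise ModuleNotFoundError("No module named 'torch'")
        return real_import(name, *args, **kwargs)

    monkeypatch.delitem(sys.modules, "torch", raising=False)
    monkeypatch.setattr(builtins, "__import__", no_torch)

    # The vision path should raise the descriptive error, not a bare crash.
    with pytest.raises(ModuleNotFoundError, match="requires torch"):
        transform_image(b"\x00\x01")
```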
Directory listing:

- llama/
- test_llama4_import_torch_free.py
- test_prompt_adapter.py
- test_sku_resolve_alias.py
- test_system_prompts.py