llama-stack-mirror/llama_stack/providers/inline/inference
Latest commit 66d6c2580e by Ihar Hrachyshka:
chore: more mypy checks (ollama, vllm, ...) (#1777)
# What does this PR do?

- **chore: mypy for strong_typing**
- **chore: mypy for remote::vllm**
- **chore: mypy for remote::ollama**
- **chore: mypy for providers.datatype**

---------

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>
2025-04-01 17:12:39 +02:00
meta_reference          fix: avoid tensor memory error (#1688)                2025-03-18 16:17:29 -07:00
sentence_transformers   chore: more mypy checks (ollama, vllm, ...) (#1777)   2025-04-01 17:12:39 +02:00
vllm                    fix: Updating ToolCall.arguments to allow for json strings that can be decoded on client side (#1685)   2025-03-19 10:36:19 -07:00
__init__.py             precommit                                             2024-11-08 17:58:58 -08:00