llama-stack/llama_stack/providers/utils/inference
Ihar Hrachyshka 66d6c2580e
chore: more mypy checks (ollama, vllm, ...) (#1777)
# What does this PR do?

- **chore: mypy for strong_typing**
- **chore: mypy for remote::vllm**
- **chore: mypy for remote::ollama**
- **chore: mypy for providers.datatype**

---------

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>
2025-04-01 17:12:39 +02:00
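
The PR body above only enumerates which modules were brought under mypy. As a rough, hypothetical illustration of what such a pass usually entails — none of the names below are taken from the actual diff — the typical change is adding explicit parameter and return annotations so mypy can verify call sites:

```python
from typing import Optional


def resolve_provider_model_id(alias_to_id: dict[str, str], alias: str) -> Optional[str]:
    """Illustrative helper (not from the PR): explicit annotations let mypy check callers.

    Returning Optional[str] makes the "alias not found" case part of the
    type-checked contract instead of an implicit None.
    """
    return alias_to_id.get(alias)


# Example call site that mypy can now validate end to end:
model_id = resolve_provider_model_id({"llama3": "meta-llama/Llama-3-8B"}, "llama3")
```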
| File | Last commit | Date |
|------|-------------|------|
| __init__.py | chore: move all Llama Stack types from llama-models to llama-stack (#1098) | 2025-02-14 09:10:59 -08:00 |
| embedding_mixin.py | fix: dont assume SentenceTransformer is imported | 2025-02-25 16:53:01 -08:00 |
| litellm_openai_mixin.py | fix: agents with non-llama model (#1550) | 2025-03-17 22:11:06 -07:00 |
| model_registry.py | chore: more mypy checks (ollama, vllm, ...) (#1777) | 2025-04-01 17:12:39 +02:00 |
| openai_compat.py | chore: more mypy checks (ollama, vllm, ...) (#1777) | 2025-04-01 17:12:39 +02:00 |
| prompt_adapter.py | feat(logging): implement category-based logging (#1362) | 2025-03-07 11:34:30 -08:00 |