llama-stack-mirror/llama_stack/providers/remote
Matthew Farrellee 3d119a86d4
chore: indicate to mypy that InferenceProvider.batch_completion/batch_chat_completion is concrete (#3239)
# What does this PR do?

closes https://github.com/llamastack/llama-stack/issues/3236

mypy considered our default implementations (a bare `raise NotImplementedError`) to be trivial, and therefore effectively abstract. As a result, every provider had to re-implement the same stubs just to satisfy the type checker.

This change puts enough into the default implementations for mypy to consider them non-trivial, i.e. concrete, which lets us remove the duplicate implementations from the providers.
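
For illustration, here is a minimal sketch of the pattern (the protocol name matches the codebase, but the signatures and message below are simplified placeholders, not the actual llama-stack API). mypy treats a method body consisting only of `pass`, `...`, or `raise NotImplementedError` as trivial, and trivial protocol members count as abstract in explicit subclasses; adding any real statement, such as binding the error message to a local variable before raising, is enough for the method to count as concrete:

```python
from typing import Protocol


class InferenceProvider(Protocol):
    # A body that is only `raise NotImplementedError` is "trivial" to mypy,
    # so the member counts as abstract and every explicit subclass is
    # forced to re-declare it.
    #
    # Binding the message to a local first makes the body non-trivial, so
    # mypy treats the method as concrete and subclasses inherit it as a
    # runtime NotImplementedError default. (Hypothetical signature.)
    async def batch_completion(self, model_id: str, prompts: list[str]) -> list[str]:
        msg = f"{type(self).__name__} does not implement batch_completion"
        raise NotImplementedError(msg)


class MyProvider(InferenceProvider):
    """A provider that no longer needs its own batch_completion stub."""
```

With the trivial body, mypy would require `MyProvider` to declare `batch_completion` itself; with the version above, providers simply inherit the default and only override it when they actually support batching.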
2025-08-22 14:17:30 -07:00
| Name | Last commit | Date |
| --- | --- | --- |
| agents | test: add unit test to ensure all config types are instantiable (#1601) | 2025-03-12 22:29:58 -07:00 |
| datasetio | chore(misc): make tests and starter faster (#3042) | 2025-08-05 14:55:05 -07:00 |
| eval | chore(rename): move llama_stack.distribution to llama_stack.core (#2975) | 2025-07-30 23:30:53 -07:00 |
| files/s3 | feat: Add S3 Files Provider (#3202) | 2025-08-22 10:38:59 -04:00 |
| inference | chore: indicate to mypy that InferenceProvider.batch_completion/batch_chat_completion is concrete (#3239) | 2025-08-22 14:17:30 -07:00 |
| post_training | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| safety | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| tool_runtime | chore(rename): move llama_stack.distribution to llama_stack.core (#2975) | 2025-07-30 23:30:53 -07:00 |
| vector_io | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| __init__.py | impls -> inline, adapters -> remote (#381) | 2024-11-06 14:54:05 -08:00 |