llama-stack-mirror/llama_stack/providers/remote
Matthew Farrellee f6d1867bf5 chore: remove batch-related APIs
APIs removed:
 - POST /v1/batch-inference/completion
 - POST /v1/batch-inference/chat-completion
 - POST /v1/inference/batch-completion
 - POST /v1/inference/batch-chat-completion

Notes:
 - batch-completion & batch-chat-completion were only implemented for inference=inline::meta-reference
 - the batch-inference endpoints were not implemented
2025-08-26 19:18:16 -04:00
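
For context, below is a minimal sketch of what a request to one of the removed endpoints might have looked like; the server address, model id, and payload field names (e.g. messages_batch) are illustrative assumptions rather than details taken from this listing.

```python
# Hypothetical sketch of a call to one of the now-removed batch endpoints.
# Base URL, model id, and payload field names are assumptions for illustration;
# against a server built after commit f6d1867bf5 this route should return 404.
import requests

resp = requests.post(
    "http://localhost:8321/v1/inference/batch-chat-completion",  # removed endpoint
    json={
        "model_id": "meta-llama/Llama-3.1-8B-Instruct",  # assumed model identifier
        "messages_batch": [  # assumed field: one message list per batch item
            [{"role": "user", "content": "Hello"}],
            [{"role": "user", "content": "Summarize the Llama Stack project."}],
        ],
    },
    timeout=60,
)
print(resp.status_code)  # expected to be 404 now that the batch APIs are gone
```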
agents         test: add unit test to ensure all config types are instantiable (#1601)  2025-03-12 22:29:58 -07:00
datasetio      chore(misc): make tests and starter faster (#3042)  2025-08-05 14:55:05 -07:00
eval           chore(rename): move llama_stack.distribution to llama_stack.core (#2975)  2025-07-30 23:30:53 -07:00
files/s3       feat: Add S3 Files Provider (#3202)  2025-08-22 10:38:59 -04:00
inference      chore: remove batch-related APIs  2025-08-26 19:18:16 -04:00
post_training  refactor(logging): rename llama_stack logger categories (#3065)  2025-08-21 17:31:04 -07:00
safety         refactor(logging): rename llama_stack logger categories (#3065)  2025-08-21 17:31:04 -07:00
tool_runtime   chore(rename): move llama_stack.distribution to llama_stack.core (#2975)  2025-07-30 23:30:53 -07:00
vector_io      refactor(logging): rename llama_stack logger categories (#3065)  2025-08-21 17:31:04 -07:00
__init__.py    impls -> inline, adapters -> remote (#381)  2024-11-06 14:54:05 -08:00