llama-stack-mirror/llama_stack/providers/impls
ios/inference     Add function for stopping inference (#224)            2024-10-09 10:50:19 -04:00
meta_reference    [bugfix] Fix logprobs on meta-reference impl (#213)   2024-10-07 19:42:39 -07:00
vllm              Inline vLLM inference provider (#181)                  2024-10-05 23:34:16 -07:00
__init__.py       API Updates (#73)                                      2024-09-17 19:51:35 -07:00