Now that `remote-vllm` includes `inline::sentence_transformers`, building the image fails with:

Error building stack: `SentenceTransformersInferenceConfig.sample_run_config()` got an unexpected keyword argument `'__distro_dir__'`

To avoid that issue, this fix extends `sample_run_config` to accept extra kwargs, as sketched below.
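A minimal sketch of the change, assuming the config is a pydantic model with no required fields (the exact class lives in the llama-stack providers tree; the body shown here is illustrative): `sample_run_config` now accepts and ignores extra keyword arguments such as `__distro_dir__`, which the distribution build machinery passes to every provider config.

```python
from typing import Any, Dict

from pydantic import BaseModel


class SentenceTransformersInferenceConfig(BaseModel):
    @classmethod
    def sample_run_config(cls, **kwargs: Any) -> Dict[str, Any]:
        # Accept (and ignore) extra keyword arguments such as
        # `__distro_dir__`, so callers that pass template-specific
        # values no longer raise a TypeError.
        return {}
```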