Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-07-23 04:53:14 +00:00)
Now that remote-vllm includes inline::sentence_transformers, there is an issue building the image: `Error building stack: SentenceTransformersInferenceConfig.sample_run_config() got an unexpected keyword argument '__distro_dir__'`. To avoid this issue, this fix extends `sample_run_config` to accept extra kwargs.
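The error occurs because the distro build machinery calls every provider config's `sample_run_config()` with the same set of keyword arguments (such as `__distro_dir__`, which other providers use to construct file paths), while the sentence-transformers config did not accept them. Below is a minimal sketch of the kind of signature change described, not the repository's actual implementation: the class and method names are taken from the error message, and the example path passed as `__distro_dir__` is purely hypothetical.

```python
from typing import Any, Dict


class SentenceTransformersInferenceConfig:
    """Illustrative stand-in for the provider config class named in the error."""

    @classmethod
    def sample_run_config(cls, **kwargs: Any) -> Dict[str, Any]:
        # Accepting **kwargs lets template code pass shared arguments such as
        # __distro_dir__ without raising "got an unexpected keyword argument",
        # even though this provider has no use for them.
        return {}


# The distro templates invoke every provider's sample_run_config uniformly, e.g.:
config = SentenceTransformersInferenceConfig.sample_run_config(
    __distro_dir__="distributions/remote-vllm"  # hypothetical path for illustration
)
```

Swallowing unused keyword arguments keeps the provider compatible with the shared template code without changing how its own configuration is generated.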
Directory contents:

- inline/
- registry/
- remote/
- tests/
- utils/
- __init__.py
- datatypes.py