llama-stack/llama_stack/providers
Luis Tomas Bolivar 73c6f6126f
fix: Avoid unexpected keyword argument for sentence_transformers (#1269)
Now that remote-vllm includes inline::sentence_transformers, there is an
issue building the image:
Error building stack:
SentenceTransformersInferenceConfig.sample_run_config() got an
unexpected keyword argument '__distro_dir__'

To avoid that issue, this fix extends sample_run_config to accept
extra kwargs.
2025-02-27 16:47:26 -08:00
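The fix can be sketched as follows. This is a minimal, hypothetical reconstruction (the real config class in llama-stack has more fields and methods): `sample_run_config` simply accepts and ignores arbitrary extra keyword arguments, so distro-level callers that pass `__distro_dir__` no longer raise a `TypeError`.

```python
# Minimal sketch of the fix; class body and return value are illustrative,
# not the actual llama-stack implementation.
from typing import Any, Dict


class SentenceTransformersInferenceConfig:
    @classmethod
    def sample_run_config(cls, **kwargs: Any) -> Dict[str, Any]:
        # Extra kwargs (e.g. __distro_dir__) are accepted but unused,
        # so the image build no longer fails on unexpected arguments.
        return {}


# Before the fix, this call raised:
#   TypeError: sample_run_config() got an unexpected keyword argument '__distro_dir__'
config = SentenceTransformersInferenceConfig.sample_run_config(
    __distro_dir__="~/.llama/distributions/remote-vllm"
)
```

Accepting `**kwargs` keeps every provider's `sample_run_config` callable with a uniform signature, at the cost of silently dropping unrecognized arguments.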
inline fix: Avoid unexpected keyword argument for sentence_transformers (#1269) 2025-02-27 16:47:26 -08:00
registry fix: groq now depends on litellm 2025-02-27 14:07:12 -08:00
remote fix: register provider model name and HF alias in run.yaml (#1304) 2025-02-27 16:39:23 -08:00
tests feat(providers): Groq now uses LiteLLM openai-compat (#1303) 2025-02-27 13:16:50 -08:00
utils feat(providers): Groq now uses LiteLLM openai-compat (#1303) 2025-02-27 13:16:50 -08:00
__init__.py API Updates (#73) 2024-09-17 19:51:35 -07:00
datatypes.py chore: move all Llama Stack types from llama-models to llama-stack (#1098) 2025-02-14 09:10:59 -08:00