feat(providers): sambanova updated to use LiteLLM openai-compat (#1596)

# What does this PR do?

Switch the SambaNova inference adapter to LiteLLM's OpenAI-compatible layer. This simplifies the integration and fixes issues the current adapter has with streaming and tool calling. Models and templates are updated accordingly.
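
For context, a minimal sketch of how the LiteLLM-backed provider might appear in a distribution run config. Field names and the endpoint URL are illustrative assumptions, not taken from this diff:

```yaml
# Hypothetical run.yaml fragment (field names and URL assumed for illustration)
providers:
  inference:
  - provider_id: sambanova
    provider_type: remote::sambanova
    config:
      url: https://api.sambanova.ai/v1   # assumed OpenAI-compatible endpoint
      api_key: ${env.SAMBANOVA_API_KEY}  # read from the environment
```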

## Test Plan
```shell
pytest -s -v tests/integration/inference/test_text_inference.py \
  --stack-config=sambanova \
  --text-model=sambanova/Meta-Llama-3.3-70B-Instruct
```

```shell
pytest -s -v tests/integration/inference/test_vision_inference.py \
  --stack-config=sambanova \
  --vision-model=sambanova/Llama-3.2-11B-Vision-Instruct
```
Commit: b2b00a216b (parent dd49ef31f1)
Author: Jorge Piedrahita Ortiz, 2025-05-06 18:50:22 -05:00, committed by GitHub
15 changed files with 529 additions and 404 deletions

```diff
@@ -1,9 +1,10 @@
 version: '2'
 distribution_spec:
-  description: Use SambaNova.AI for running LLM inference
+  description: Use SambaNova for running LLM inference
   providers:
     inference:
     - remote::sambanova
+    - inline::sentence-transformers
     vector_io:
     - inline::faiss
     - remote::chromadb
@@ -18,4 +19,6 @@ distribution_spec:
     - remote::brave-search
     - remote::tavily-search
     - inline::rag-runtime
+    - remote::model-context-protocol
+    - remote::wolfram-alpha
 image_type: conda
```