chore: update the groq inference impl to use openai-python for openai-compat functions

changes on api.groq.com:
- json_schema is now supported for specific models (sketched below); see https://console.groq.com/docs/structured-outputs#supported-models
- response_format with streaming is now supported for models that support response_format
- groq no longer returns a 400 error if tools are provided and tool_choice is not "required"
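As a rough illustration of the new behavior (not part of this diff), here is a minimal openai-python sketch against Groq's OpenAI-compatible endpoint. The model name and schema are placeholders; substitute a model from the supported-models list linked above.

import os

from openai import OpenAI

# Groq exposes an OpenAI-compatible API; this base_url is its documented compat endpoint.
client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"],
)

# json_schema response_format (newly supported for specific models).
response = client.chat.completions.create(
    model="llama-3.3-70b-versatile",  # placeholder; must be a model from the supported list
    messages=[{"role": "user", "content": "Name a city and its country."}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "city",
            "schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                    "country": {"type": "string"},
                },
                "required": ["city", "country"],
            },
        },
    },
)
print(response.choices[0].message.content)

# response_format combined with streaming (also newly supported).
stream = client.chat.completions.create(
    model="llama-3.3-70b-versatile",  # placeholder
    messages=[{"role": "user", "content": "Name a city and its country."}],
    response_format={"type": "json_object"},
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")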
Matthew Farrellee 2025-09-06 08:08:08 -04:00
parent 47b640370e
commit 6911145263
2 changed files with 9 additions and 132 deletions


@@ -248,7 +248,7 @@ Available Models:
         api=Api.inference,
         adapter=AdapterSpec(
             adapter_type="groq",
-            pip_packages=["litellm"],
+            pip_packages=["litellm", "openai"],
             module="llama_stack.providers.remote.inference.groq",
             config_class="llama_stack.providers.remote.inference.groq.GroqConfig",
             provider_data_validator="llama_stack.providers.remote.inference.groq.config.GroqProviderDataValidator",