llama-stack-mirror/llama_stack/providers/remote/inference/groq
Matthew Farrellee 6911145263 chore: update the groq inference impl to use openai-python for openai-compat functions
Changes on api.groq.com:
- json_schema is now supported for specific models; see https://console.groq.com/docs/structured-outputs#supported-models
- response_format with streaming is now supported for models that support response_format
- Groq no longer returns a 400 error when tools are provided and tool_choice is not "required" (see the call sketch after this entry)
2025-09-06 08:53:41 -04:00
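For orientation, here is a minimal sketch of the call pattern these changes enable through the stock openai-python client pointed at Groq's openai-compat endpoint: a json_schema response_format combined with stream=True. The model name and schema are illustrative placeholders, not taken from this repo; confirm structured-output support for a given model against the doc linked above.

```python
# Minimal sketch: stock openai-python against Groq's openai-compat endpoint.
# Model and schema are placeholders; check the structured-outputs doc above.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible API
    api_key=os.environ["GROQ_API_KEY"],
)

# json_schema response_format now works for supported models, even with streaming.
stream = client.chat.completions.create(
    model="llama-3.3-70b-versatile",  # placeholder; verify json_schema support
    messages=[{"role": "user", "content": "Name a city and its country."}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "city_info",
            "schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                    "country": {"type": "string"},
                },
                "required": ["city", "country"],
            },
        },
    },
    stream=True,  # streaming + response_format no longer needs special-casing
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```

The third change affects the same call path: tools can now be passed with the default tool_choice ("auto") without Groq returning a 400, so the impl no longer has to force tool_choice="required" or strip tools before sending the request.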
__init__.py feat(providers): Groq now uses LiteLLM openai-compat (#1303) 2025-02-27 13:16:50 -08:00
config.py feat(starter)!: simplify starter distro; litellm model registry changes (#2916) 2025-07-25 15:02:04 -07:00
groq.py chore: update the groq inference impl to use openai-python for openai-compat functions 2025-09-06 08:53:41 -04:00
models.py feat(starter)!: simplify starter distro; litellm model registry changes (#2916) 2025-07-25 15:02:04 -07:00