llama-stack-mirror/llama_stack/providers
Matthew Farrellee 6911145263 chore: update the groq inference impl to use openai-python for openai-compat functions
Changes on api.groq.com:
- json_schema is now supported for specific models, see https://console.groq.com/docs/structured-outputs#supported-models
- response_format with streaming is now supported for models that support response_format
- groq no longer returns a 400 error if tools are provided and tool_choice is not "required"
2025-09-06 08:53:41 -04:00
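The changes above can be sketched with a request payload as openai-python would send it. This is a minimal illustration, not code from the repo: the schema contents and the commented-out model name are assumptions, and the actual supported models are listed at the console.groq.com link above.

```python
# Hedged sketch of the newly supported json_schema response_format on
# api.groq.com, as it would be passed to openai-python's
# client.chat.completions.create(...). All field values here are
# illustrative assumptions, not from the source.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "weather_report",  # hypothetical schema name
        "schema": {
            "type": "object",
            "properties": {"temperature_c": {"type": "number"}},
            "required": ["temperature_c"],
        },
    },
}

# With an openai-python client pointed at Groq's OpenAI-compatible base URL,
# the call would look roughly like this (commented out: needs an API key,
# and the model name is a placeholder -- check the supported-models docs):
#
# from openai import OpenAI
# client = OpenAI(base_url="https://api.groq.com/openai/v1", api_key="...")
# stream = client.chat.completions.create(
#     model="<a-model-that-supports-response_format>",
#     messages=[{"role": "user", "content": "Report the temperature as JSON."}],
#     response_format=response_format,
#     stream=True,  # streaming with response_format is now supported
# )
```

Per the commit notes, tools may now also be passed without forcing `tool_choice="required"`, since the API no longer rejects that combination with a 400.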
inline feat(batches, completions): add /v1/completions support to /v1/batches (#3309) 2025-09-05 11:59:57 -07:00
registry chore: update the groq inference impl to use openai-python for openai-compat functions 2025-09-06 08:53:41 -04:00
remote chore: update the groq inference impl to use openai-python for openai-compat functions 2025-09-06 08:53:41 -04:00
utils fix: use lambda pattern for bedrock config env vars (#3307) 2025-09-05 10:45:11 +02:00
__init__.py API Updates (#73) 2024-09-17 19:51:35 -07:00
datatypes.py feat: create unregister shield API endpoint in Llama Stack (#2853) 2025-08-05 07:33:46 -07:00