llama-stack/llama_stack/providers/remote/inference/groq
Latest commit f9ca441974 by Yuan Tang (2025-02-13 12:14:57 -05:00):
chore: Link to Groq docs in the warning message for preview model (#1060)
This should be `llama-3.2-3b` instead of `llama-3.2-3b-instruct`.
__init__.py [#432] Add Groq Provider - chat completions (#609) 2025-01-03 08:27:49 -08:00
config.py [#432] Add Groq Provider - chat completions (#609) 2025-01-03 08:27:49 -08:00
groq.py chore: Link to Groq docs in the warning message for preview model (#1060) 2025-02-13 12:14:57 -05:00
groq_utils.py feat: Support tool calling for streaming chat completion in remote vLLM provider (#1063) 2025-02-12 06:17:21 -08:00