llama-stack-mirror/llama_stack/models/llama

Latest commit: a1da09e166 — feat: Support "stop" parameter in remote:vLLM
Author: Yuan Tang
Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
Date: 2025-03-19 22:41:34 -04:00
llama3            fix: Updating ToolCall.arguments to allow for json strings that can be decoded on client side (#1685)   2025-03-19 10:36:19 -07:00
llama3_1          chore: remove straggler references to llama-models (#1345)                                              2025-03-01 14:26:03 -08:00
llama3_2          chore: remove straggler references to llama-models (#1345)                                              2025-03-01 14:26:03 -08:00
llama3_3          chore: remove dependency on llama_models completely (#1344)                                             2025-03-01 12:48:08 -08:00
datatypes.py      feat: Support "stop" parameter in remote:vLLM                                                           2025-03-19 22:41:34 -04:00
prompt_format.py  chore: remove dependency on llama_models completely (#1344)                                             2025-03-01 12:48:08 -08:00
sku_list.py       chore: move all Llama Stack types from llama-models to llama-stack (#1098)                              2025-02-14 09:10:59 -08:00