Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-07-18 02:42:31 +00:00)
# What does this PR do?

This PR adds two methods to the Inference API:

- `batch_completion`
- `batch_chat_completion`

The motivation is evaluations that target a local inference engine (like meta-reference or vllm), where batch APIs provide a substantial amount of acceleration.

Why did I not add this to `Api.batch_inference`? That would have meant a _lot_ more book-keeping given the structure of Llama Stack: I would have needed to create a notion of a "batch model" resource, set up routing based on it, and so on. That does not sound ideal.

So what's the future of the batch inference API? I am not sure. Maybe we can keep it for true _asynchronous_ execution, so you can submit requests and it returns a Job instance, etc.

## Test Plan

Run meta-reference-gpu using:

```bash
export INFERENCE_MODEL=meta-llama/Llama-4-Scout-17B-16E-Instruct
export INFERENCE_CHECKPOINT_DIR=../checkpoints/Llama-4-Scout-17B-16E-Instruct-20250331210000
export MODEL_PARALLEL_SIZE=4
export MAX_BATCH_SIZE=32
export MAX_SEQ_LEN=6144

LLAMA_MODELS_DEBUG=1 llama stack run meta-reference-gpu
```

Then run the batch inference test case.
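To illustrate why a batch endpoint accelerates evaluations, here is a toy cost model, not the Llama Stack implementation: with a per-request API every prompt pays the fixed request overhead, while a batch API pays it once and lets the engine run the prompts together. The constants and function names below are hypothetical.

```python
# Toy cost model (hypothetical numbers): N separate completion() calls
# versus one batch_completion() call over N prompts.

FIXED_OVERHEAD_MS = 50   # assumed per-request cost (scheduling, setup, ...)
PER_PROMPT_MS = 10       # assumed marginal cost of one prompt in a batch

def sequential_cost(n_prompts: int) -> int:
    # n separate requests: the fixed overhead is paid n times
    return n_prompts * (FIXED_OVERHEAD_MS + PER_PROMPT_MS)

def batch_cost(n_prompts: int) -> int:
    # one batch request: the fixed overhead is paid once
    return FIXED_OVERHEAD_MS + n_prompts * PER_PROMPT_MS

print(sequential_cost(32))  # 1920
print(batch_cost(32))       # 370
```

Under these assumed constants a batch of 32 prompts (matching `MAX_BATCH_SIZE=32` above) is several times cheaper than 32 sequential requests; the real speedup depends on the engine and hardware.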
61 lines
1.7 KiB
JSON
```json
{
  "sanity": {
    "data": {
      "content": "Complete the sentence using one word: Roses are red, violets are "
    }
  },
  "non_streaming": {
    "data": {
      "content": "Micheael Jordan is born in ",
      "expected": "1963"
    }
  },
  "stop_sequence": {
    "data": {
      "content": "Return the exact same sentence and don't add additional words): Michael Jordan was born in the year of 1963"
    }
  },
  "streaming": {
    "data": {
      "content": "Roses are red,"
    }
  },
  "log_probs": {
    "data": {
      "content": "Complete the sentence: Micheael Jordan is born in "
    }
  },
  "logprobs_non_streaming": {
    "data": {
      "content": "Micheael Jordan is born in "
    }
  },
  "logprobs_streaming": {
    "data": {
      "content": "Roses are red,"
    }
  },
  "structured_output": {
    "data": {
      "user_input": "Michael Jordan was born in 1963. He played basketball for the Chicago Bulls. He retired in 2003.",
      "expected": {
        "name": "Michael Jordan",
        "year_born": "1963",
        "year_retired": "2003"
      }
    }
  },
  "batch_completion": {
    "data": {
      "contents": [
        "Micheael Jordan is born in ",
        "Roses are red, violets are ",
        "If you had a million dollars, what would you do with it? ",
        "All you need is ",
        "The capital of France is ",
        "It is a good day to ",
        "The answer to the universe is "
      ]
    }
  }
}
```
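A hypothetical sketch of how a test might consume the `batch_completion` block of the fixture above. The fixture is inlined here as a two-prompt excerpt, and since the client call is not part of this file, it is replaced by a stub; `fake_batch_completion` is an invented name, not a Llama Stack API.

```python
# Sketch: drive a (stubbed) batch completion call from the fixture's
# "batch_completion" block. Excerpted data; stubbed client.
import json

FIXTURE_EXCERPT = """
{
  "batch_completion": {
    "data": {
      "contents": [
        "The capital of France is ",
        "The answer to the universe is "
      ]
    }
  }
}
"""

def fake_batch_completion(contents):
    # Stand-in for the real batch call; returns one string per prompt.
    return ["<generated>" for _ in contents]

case = json.loads(FIXTURE_EXCERPT)["batch_completion"]["data"]
results = fake_batch_completion(case["contents"])

# A batch API should return exactly one completion per input prompt.
assert len(results) == len(case["contents"])
print(len(results))  # 2
```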