llama-stack-mirror/tests/integration/inference/recordings/d3cdda901f5ae18d0d636419bd15fd4c5c45dfd6b0f35b0a708f10953874f047.json
Aakanksha Duggal b83184f7ef
feat(responses)!: Add web_search_2025_08_26 to the WebSearchToolTypes (#4103)
# What does this PR do?
Resolves #4102 

1. Added `web_search_2025_08_26` to the `WebSearchToolTypes` list and
the `OpenAIResponseInputToolWebSearch.type` Literal union (see the sketch after this list).
2. No changes were needed to the tool execution logic, since all `web_search`
types map to the same underlying tool.
3. Backward compatibility is maintained: the existing `web_search`,
`web_search_preview`, and `web_search_preview_2025_03_11` types continue
to work.
4. Added an integration test case using `{"type": "web_search_2025_08_26"}`
to verify that the new type works correctly.
5. Updated `docs/docs/providers/openai_responses_limitations.mdx` to
reflect that `web_search_2025_08_26` is now supported.
6. Removed incorrect references to `MOD1/MOD2/MOD3`, which don't exist
in the codebase.
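
For reference, a minimal sketch of what the type change looks like; the Pydantic base class, default value, and module layout are assumptions rather than code copied from the PR:

```python
from typing import Literal

from pydantic import BaseModel

# Every alias maps to the same underlying web search tool.
WebSearchToolTypes = [
    "web_search",
    "web_search_preview",
    "web_search_preview_2025_03_11",
    "web_search_2025_08_26",  # newly added alias
]


class OpenAIResponseInputToolWebSearch(BaseModel):
    # The Literal union mirrors WebSearchToolTypes so validation accepts
    # every alias, including the new one.
    type: Literal[
        "web_search",
        "web_search_preview",
        "web_search_preview_2025_03_11",
        "web_search_2025_08_26",
    ] = "web_search"
```

On the client side, the new alias can then be passed like any other web search tool type in an OpenAI-compatible Responses request (client construction, model, and input text are placeholders):

```python
response = client.responses.create(
    model="llama3.2:3b-instruct-fp16",
    input="What's the latest weather forecast for San Francisco, CA?",
    tools=[{"type": "web_search_2025_08_26"}],
)
```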



## Test Plan
The new integration test case using `{"type": "web_search_2025_08_26"}` (item 4 above) exercises the added tool type.

---------

Signed-off-by: Aakanksha Duggal <aduggal@redhat.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2025-11-07 10:01:12 -08:00

62 lines · 2.8 KiB · JSON · Generated

{
"test_id": "tests/integration/inference/test_openai_completion.py::test_openai_chat_completion_with_tool_choice_none[txt=ollama/llama3.2:3b-instruct-fp16-inference:chat_completion:tool_calling]",
"request": {
"method": "POST",
"url": "http://0.0.0.0:11434/v1/v1/chat/completions",
"headers": {},
"body": {
"model": "llama3.2:3b-instruct-fp16",
"messages": [
{
"role": "system",
"content": "Pretend you are a weather assistant."
},
{
"role": "user",
"content": "What's the weather like in San Francisco, CA?"
}
],
"stream": false
},
"endpoint": "/v1/chat/completions",
"model": "llama3.2:3b-instruct-fp16"
},
"response": {
"body": {
"__type__": "openai.types.chat.chat_completion.ChatCompletion",
"__data__": {
"id": "rec-d3cdda901f5a",
"choices": [
{
"finish_reason": "stop",
"index": 0,
"logprobs": null,
"message": {
"content": "As your weather assistant today, I've got the latest forecast for you.\n\nSan Francisco, California currently has a partially cloudy sky with a mix of sun and clouds. The temperature is mild, around 68 degrees Fahrenheit (20 degrees Celsius) with a gentle breeze blowing at about 5 miles per hour from the northwest.\n\nHowever, please note that San Francisco's weather is notoriously unpredictable due to its proximity to the Pacific Ocean, so conditions can change rapidly throughout the day.\n\nHere's what you might expect later today:\n\n* High pressure will prevail in the Bay Area, leading to mostly sunny skies with a few puffy clouds by mid-afternoon.\n* Overnight lows are expected to dip to around 55 degrees Fahrenheit (13 degrees Celsius), with a slight chance of scattered showers returning tomorrow morning.\n\nIf you're planning on visiting San Francisco or have outdoor plans today, I recommend checking the forecast again before heading out and packing accordingly!\n\nWould you like more detailed forecasts for specific times, days, or areas in San Francisco?",
"refusal": null,
"role": "assistant",
"annotations": null,
"audio": null,
"function_call": null,
"tool_calls": null
}
}
],
"created": 0,
"model": "llama3.2:3b-instruct-fp16",
"object": "chat.completion",
"service_tier": null,
"system_fingerprint": "fp_ollama",
"usage": {
"completion_tokens": 204,
"prompt_tokens": 45,
"total_tokens": 249,
"completion_tokens_details": null,
"prompt_tokens_details": null
}
}
},
"is_streaming": false
},
"id_normalization_mapping": {}
}
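
For context, the recording above captures a non-streaming chat completion request and its response. Below is a minimal sketch of issuing the equivalent request directly against a local Ollama server's OpenAI-compatible endpoint; the `base_url` and `api_key` values are assumptions (Ollama ignores the key):

```python
from openai import OpenAI

# Assumed local Ollama OpenAI-compatible endpoint.
client = OpenAI(base_url="http://0.0.0.0:11434/v1", api_key="ollama")

# Same model, messages, and stream setting as the recorded request body.
resp = client.chat.completions.create(
    model="llama3.2:3b-instruct-fp16",
    messages=[
        {"role": "system", "content": "Pretend you are a weather assistant."},
        {"role": "user", "content": "What's the weather like in San Francisco, CA?"},
    ],
    stream=False,
)
print(resp.choices[0].message.content)
```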