feat(tools)!: substantial clean up of "Tool" related datatypes (#3627)

This is a sweeping change to clean up some gunk around our "Tool"
definitions.

First, we had two types, `Tool` and `ToolDef`. The former was a "Resource"
type for the registry, but we stopped registering individual tools in the
Registry long ago (we only register ToolGroups). The latter was used to
specify tools for the Agents API. This PR removes the former and adds an
optional `toolgroup_id` field to the latter; a sketch of the consolidated
shape follows.
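
For illustration, here is a minimal sketch of the consolidated datatype. Only
`toolgroup_id`, `input_schema`, and `output_schema` are taken from this PR's
description; the remaining field names and types are assumptions, so treat the
real `ToolDef` definition in the codebase as authoritative.

```python
from typing import Any

from pydantic import BaseModel


class ToolDef(BaseModel):
    """Sketch of the single remaining tool datatype (illustrative only)."""

    name: str
    description: str | None = None
    # Newly added: records which ToolGroup a tool belongs to, now that the
    # standalone `Tool` resource type is gone.
    toolgroup_id: str | None = None
    # Full JSON schemas (see below) instead of a flattened `parameters` list.
    input_schema: dict[str, Any] | None = None
    output_schema: dict[str, Any] | None = None
    metadata: dict[str, Any] | None = None
```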

Second, as pointed out by @bbrowning in
https://github.com/llamastack/llama-stack/pull/3003#issuecomment-3245270132,
we were doing a lossy conversion from the full JSON schema in the MCP tool
specification into our `ToolDefinition` before sending it to the model. There
is no need to do this -- we do not execute tools ourselves; we merely pass the
schema along to the chat completions API, which accepts full JSON schemas.
Because the conversion was both unnecessary and done poorly, we ran into
limitations such as not supporting array items and not resolving `$ref`s.

To fix this, we replaced the `parameters` field with `{ input_schema,
output_schema }`, both of which can be full-blown JSON schemas.
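
As a hypothetical example (the tool name and schema below are made up, built
on the `ToolDef` sketch above), an MCP tool's `inputSchema` -- including array
items and `$ref`s -- can now be attached to a `ToolDef` verbatim and forwarded
to chat completions without any flattening:

```python
# Hypothetical tool; the schemas are carried through unchanged.
weather_tool = ToolDef(
    name="get_forecasts",
    description="Get weather forecasts for a list of cities",
    input_schema={
        "type": "object",
        "properties": {
            "cities": {
                "type": "array",
                # Array items and $refs now survive end-to-end.
                "items": {"$ref": "#/$defs/City"},
            },
            "units": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["cities"],
        "$defs": {
            "City": {
                "type": "object",
                "properties": {
                    "name": {"type": "string"},
                    "country": {"type": "string"},
                },
                "required": ["name"],
            }
        },
    },
    output_schema={
        "type": "object",
        "properties": {
            "forecasts": {"type": "array", "items": {"type": "string"}},
        },
    },
)
```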

Finally, a few types in our llama-related chat format conversion needed
attention, and we take this opportunity to clean those up as well.

This PR is a substantial breaking change to the API. However, given our
window for introducing breaking changes, this suits us just fine. I will
be landing a concurrent `llama-stack-client` change as well since API
shapes are changing.
Ashwin Bharambe 2025-10-02 15:12:03 -07:00 committed by Raghotham Murthy
parent d933d354e4
commit 2e544ecd8a
179 changed files with 34186 additions and 9171 deletions

@@ -24,14 +24,14 @@
 "body": {
 "__type__": "openai.types.chat.chat_completion.ChatCompletion",
 "__data__": {
-"id": "chatcmpl-123",
+"id": "chatcmpl-528",
 "choices": [
 {
 "finish_reason": "stop",
 "index": 0,
 "logprobs": null,
 "message": {
-"content": "Hello! As of my knowledge cutoff on December 15th, I have the latest information for you. However, please note that my data may not be entirely up-to-date.\n\nCurrently, and based on historical climate patterns, it appears to be a partly cloudy day with mild temperatures in San Francisco, CA. Expect a temperature range of around 48\u00b0F (9\u00b0C) to 54\u00b0F (12\u00b0C). It's likely to be a breezy day, with winds blowing at about 13 mph (21 km/h).\n\nHowever, if I were to look into more recent weather patterns or forecasts, I would recommend checking the latest conditions directly from reliable sources such as the National Weather Service or local news outlets for more accurate and up-to-date information.\n\nPlease let me know how I can further assist you.",
+"content": "I can give you a general idea of the typical weather conditions in San Francisco during this time.\n\nUnfortunately, I'm not aware of your current location or date. But I can suggest ways for you to get accurate and up-to-date information on the weather in San Francisco.\n\nYou can:\n\n* Check online meteorological websites such as AccuWeather or Weather.com for current conditions and forecasts.\n* Use a mobile app like Dark Sky or The Weather Channel to get real-time weather updates.\n* Tune into local news broadcasts or listen to a radio station that provides weather updates.\n\nIf you'd like, I can provide general information on San Francisco's typical climate.",
 "refusal": null,
 "role": "assistant",
 "annotations": null,
@@ -41,15 +41,15 @@
 }
 }
 ],
-"created": 1758978071,
+"created": 1759376616,
 "model": "llama3.2:3b-instruct-fp16",
 "object": "chat.completion",
 "service_tier": null,
 "system_fingerprint": "fp_ollama",
 "usage": {
-"completion_tokens": 163,
+"completion_tokens": 131,
 "prompt_tokens": 45,
-"total_tokens": 208,
+"total_tokens": 176,
 "completion_tokens_details": null,
 "prompt_tokens_details": null
 }