# llama-stack-mirror/llama_stack/models/llama/llama3

Latest commit: fix: Updating ToolCall.arguments to allow for json strings that can be decoded on client side (#1685) by Hardik Shah (`65ca85ba6b`, 2025-03-19 10:36:19 -07:00)
### What does this PR do?

Currently, `ToolCall.arguments` is a `Dict[str, RecursiveType]`. On the client SDK side, however, `RecursiveType` gets deserialized into a single numeric type (`int` and `float` collapse together), so parameters that are `int` arrive as `float`, which can break client-side tools that do type checking.
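
For illustration, here is a minimal sketch of the failure mode and of why shipping the arguments as a JSON string avoids it. The coercion step only simulates the SDK's numeric collapsing; it is not the actual client code.

```python
import json

# A tool-call argument that should stay an int.
arguments = {"max_results": 5}

# Simulate the SDK collapsing all numbers into one numeric type:
collapsed = {k: float(v) if isinstance(v, (int, float)) else v
             for k, v in arguments.items()}
assert isinstance(collapsed["max_results"], float)  # 5 became 5.0

# If the arguments travel as a JSON string instead, decoding on the
# client preserves the int/float distinction:
decoded = json.loads(json.dumps(arguments))
assert isinstance(decoded["max_results"], int)  # still 5, still an int
```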

Closes: https://github.com/meta-llama/llama-stack/issues/1683
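
As a rough sketch of the shape of the change (a hypothetical field definition, not the actual llama-stack model), `arguments` could be widened so that a JSON-encoded string is also accepted and decoded on the client side:

```python
from typing import Any, Dict, Union

from pydantic import BaseModel


class ToolCall(BaseModel):
    call_id: str
    tool_name: str
    # Hypothetical widened type: either an already-decoded dict or a
    # JSON-encoded string the client decodes with json.loads(), which
    # preserves the int/float distinction.
    arguments: Union[str, Dict[str, Any]]
```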

### Test Plan
Stainless changes --
https://github.com/meta-llama/llama-stack-client-python/pull/204
```
pytest -s -v --stack-config=fireworks tests/integration/agents/test_agents.py  --text-model meta-llama/Llama-3.1-8B-Instruct
```
| File | Last commit | Last updated |
| --- | --- | --- |
| `prompt_templates` | refactor: move all datetime.now() calls to UTC (#1589) | 2025-03-13 15:34:53 -07:00 |
| `__init__.py` | chore: remove dependency on llama_models completely (#1344) | 2025-03-01 12:48:08 -08:00 |
| `chat_format.py` | fix: Updating ToolCall.arguments to allow for json strings that can be decoded on client side (#1685) | 2025-03-19 10:36:19 -07:00 |
| `dog.jpg` | chore: move all Llama Stack types from llama-models to llama-stack (#1098) | 2025-02-14 09:10:59 -08:00 |
| `interface.py` | chore: remove dependency on llama_models completely (#1344) | 2025-03-01 12:48:08 -08:00 |
| `pasta.jpeg` | chore: move all Llama Stack types from llama-models to llama-stack (#1098) | 2025-02-14 09:10:59 -08:00 |
| `template_data.py` | fix: Updating ToolCall.arguments to allow for json strings that can be decoded on client side (#1685) | 2025-03-19 10:36:19 -07:00 |
| `tokenizer.model` | chore: remove dependency on llama_models completely (#1344) | 2025-03-01 12:48:08 -08:00 |
| `tokenizer.py` | chore: remove dependency on llama_models completely (#1344) | 2025-03-01 12:48:08 -08:00 |
| `tool_utils.py` | chore: remove dependency on llama_models completely (#1344) | 2025-03-01 12:48:08 -08:00 |