llama-stack/llama_stack/providers/remote/inference/vllm
Hardik Shah 65ca85ba6b
fix: Updating ToolCall.arguments to allow for json strings that can be decoded on client side (#1685)
### What does this PR do?

Currently, `ToolCall.arguments` is a `Dict[str, RecursiveType]`. However,
on the client SDK side the `RecursiveType` gets deserialized into a single
numeric type (`int` and `float` are collapsed), so parameters that are
`int` arrive as `float`, which can break client-side tools that do type
checking. This PR therefore allows `arguments` to be a JSON string that
the client decodes itself, preserving the original types.
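
To illustrate the failure mode and why a JSON string sidesteps it, here is
a minimal sketch; `collapse_numbers` is a hypothetical stand-in for the
SDK's deserialization, not actual client code:

```python
import json

def collapse_numbers(value):
    """Hypothetical stand-in for the client-side RecursiveType decoding."""
    if isinstance(value, dict):
        return {k: collapse_numbers(v) for k, v in value.items()}
    if isinstance(value, bool):          # bool is a subclass of int; keep it
        return value
    if isinstance(value, (int, float)):
        return float(value)              # int and float collapse here
    return value

server_args = {"count": 3}
client_args = collapse_numbers(server_args)
assert client_args["count"] == 3.0       # the int arrived as a float

# Sending arguments as a JSON-encoded string avoids the collapse:
# json.loads preserves the int/float distinction on the client.
encoded = json.dumps(server_args)        # server sends '{"count": 3}'
decoded = json.loads(encoded)
assert isinstance(decoded["count"], int)
```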

Closes: https://github.com/meta-llama/llama-stack/issues/1683

### Test Plan
Corresponding Stainless client changes:
https://github.com/meta-llama/llama-stack-client-python/pull/204
```
pytest -s -v --stack-config=fireworks tests/integration/agents/test_agents.py --text-model meta-llama/Llama-3.1-8B-Instruct
```
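
For reference, a hedged sketch of how a client might consume the
string-typed `arguments` after this change (the helper name below is
hypothetical, not part of the SDK):

```python
import json

# Hypothetical client-side helper (not the actual llama-stack-client API):
# `arguments` can now arrive as a JSON-encoded string, so the client
# decodes it locally and keeps the original parameter types.
def decode_tool_arguments(arguments: str) -> dict:
    return json.loads(arguments)

args = decode_tool_arguments('{"city": "Paris", "days": 3}')
assert isinstance(args["days"], int)     # int survives the round trip
```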
2025-03-19 10:36:19 -07:00
__init__.py Fix precommit check after moving to ruff (#927) 2025-02-02 06:46:45 -08:00
config.py fix: Add the option to not verify SSL at remote-vllm provider (#1585) 2025-03-18 09:33:35 -04:00
vllm.py fix: Updating ToolCall.arguments to allow for json strings that can be decoded on client side (#1685) 2025-03-19 10:36:19 -07:00