Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-06-28 10:54:19 +00:00
# What does this PR do?

Makes the code interpreter tool call async so that it is non-blocking.

## Test Plan

```
pytest -s -v tests/integration/agents/test_agents.py --stack-config=together --text-model=meta-llama/Llama-3.3-70B-Instruct
```

<img width="1693" alt="image" src="https://github.com/user-attachments/assets/42520bb6-7acf-42d5-b71f-b35ca149d722" />

[//]: # (## Documentation)

Co-authored-by: sarthakdeshpande <sarthak.deshpande@engati.com>
Files in this directory:

- `__init__.py`
- `code_env_prefix.py`
- `code_execution.py`
- `code_interpreter.py`
- `config.py`
- `matplotlib_custom_backend.py`
- `utils.py`