llama-stack/llama_stack/providers/utils/inference
ehhuang 59dddafd12
feat: convert typehints from client_tool to litellm format (#1565)
Summary:
Supports https://github.com/meta-llama/llama-stack-client-python/pull/193 (a sketch of the typehint conversion is shown below the commit details).

Test Plan:
LLAMA_STACK_CONFIG=fireworks pytest -s -v tests/integration/agents/test_agents.py --safety-shield meta-llama/Llama-Guard-3-8B --text-model meta-llama/Llama-3.1-8B-Instruct
2025-03-11 20:02:11 -07:00
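
The commit converts Python type hints on client tool functions into the litellm/OpenAI "function" tool format. The snippet below is a minimal, hypothetical sketch of that kind of conversion, not the repository's actual implementation (which lives in modules such as openai_compat.py): the helper name typehints_to_litellm_tool, the _PY_TO_JSON mapping, and the get_weather example are all illustrative assumptions.

# Hypothetical sketch: build a litellm/OpenAI-style tool spec from type hints.
import inspect
from typing import get_type_hints

# Assumed mapping from Python builtins to JSON-schema type strings.
_PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def typehints_to_litellm_tool(fn) -> dict:
    """Derive an OpenAI/litellm-compatible tool definition from a function's signature."""
    hints = get_type_hints(fn)
    hints.pop("return", None)
    sig = inspect.signature(fn)
    # One JSON-schema property per parameter; unknown types fall back to "string".
    properties = {
        name: {"type": _PY_TO_JSON.get(hints.get(name, str), "string")}
        for name in sig.parameters
    }
    # Parameters without defaults are treated as required.
    required = [
        name for name, p in sig.parameters.items()
        if p.default is inspect.Parameter.empty
    ]
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": inspect.getdoc(fn) or "",
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }

def get_weather(city: str, days: int = 1) -> str:
    """Return a short weather forecast for a city."""
    return f"Forecast for {city} over {days} day(s)"

# Prints a dict shaped like {"type": "function", "function": {"name": "get_weather", ...}}
print(typehints_to_litellm_tool(get_weather))

Only scalar builtins are mapped in this sketch; a fuller converter would also handle Optional, List, and docstring-derived parameter descriptions.
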
__init__.py
embedding_mixin.py
litellm_openai_mixin.py
model_registry.py
openai_compat.py
prompt_adapter.py