llama-stack/llama_stack
ehhuang 59dddafd12
feat: convert typehints from client_tool to litellm format (#1565)
Summary:
Supports https://github.com/meta-llama/llama-stack-client-python/pull/193 (see the illustrative sketch after the commit details below).

Test Plan:
LLAMA_STACK_CONFIG=fireworks pytest -s -v tests/integration/agents/test_agents.py --safety-shield meta-llama/Llama-Guard-3-8B --text-model meta-llama/Llama-3.1-8B-Instruct
2025-03-11 20:02:11 -07:00
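For context, the commit title describes turning a client tool's Python type hints into the litellm (OpenAI function-calling) tool schema. Below is a minimal, hypothetical sketch of that idea, not the code added in #1565: the helper name `typehints_to_litellm_tool`, the `_PY_TO_JSON` mapping, and the example function `get_boiling_point` are all assumptions made for illustration.

```python
# Hypothetical sketch: derive a litellm/OpenAI-style tool definition from a
# function's Python type hints. Not the actual llama_stack implementation.
import inspect
from typing import get_type_hints

# Assumed mapping from Python types to JSON-schema type names.
_PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}


def typehints_to_litellm_tool(fn) -> dict:
    """Build an OpenAI function-calling tool dict from fn's signature."""
    hints = get_type_hints(fn)
    hints.pop("return", None)
    sig = inspect.signature(fn)

    properties = {}
    required = []
    for name, param in sig.parameters.items():
        py_type = hints.get(name, str)
        properties[name] = {"type": _PY_TO_JSON.get(py_type, "string")}
        # Parameters without a default value are treated as required.
        if param.default is inspect.Parameter.empty:
            required.append(name)

    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }


def get_boiling_point(liquid_name: str, celsius: bool = True) -> int:
    """Return the boiling point of a liquid."""
    return 100 if celsius else 212


# Prints a tool dict with liquid_name (string, required) and celsius (boolean).
print(typehints_to_litellm_tool(get_boiling_point))
```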
apis feat(api): list agents / sessions and get agent (#1410) 2025-03-11 10:33:46 -07:00
cli revert: feat(server): Use system packages for execution (#1551) 2025-03-11 09:58:25 -07:00
distribution fix: Multiple fixes for server shutdown (fix lifespan handling; fix handling CancelledError when raised by provider; let uvicorn handle signals) (#1495) 2025-03-11 10:30:55 -07:00
models/llama refactor: move a few tests to top-level tests/ directory 2025-03-03 17:33:39 -08:00
providers feat: convert typehints from client_tool to litellm format (#1565) 2025-03-11 20:02:11 -07:00
scripts refactor(test): introduce --stack-config and simplify options (#1404) 2025-03-05 17:02:02 -08:00
strong_typing Ensure that deprecations for fields follow through to OpenAPI 2025-02-19 13:54:04 -08:00
templates fix: remove Llama-3.2-1B-Instruct for fireworks (#1558) 2025-03-11 11:19:29 -07:00
__init__.py export LibraryClient 2024-12-13 12:08:00 -08:00
env.py refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) 2025-03-04 14:53:47 -08:00
log.py feat: add support for LLAMA_STACK_LOG_FILE (#1450) 2025-03-11 11:09:31 -07:00
schema_utils.py ci: add mypy for static type checking (#1101) 2025-02-21 13:15:40 -08:00