llama-stack/llama_stack/providers/inline
Jeff Tang 82799a55bb
chore: removed executorch submodule (#1265)
# What does this PR do?

Moves the executorch submodule out of this repo and into the llama-stack-client-swift repo - PR:
https://github.com/meta-llama/llama-stack-client-swift/pull/22

2025-02-25 21:57:21 -08:00
| Name | Last commit | Date |
|---|---|---|
| agents | fix: Raise exception when tool call result is None (#1253) | 2025-02-25 13:10:50 -05:00 |
| datasetio | build: format codebase imports using ruff linter (#1028) | 2025-02-13 10:06:21 -08:00 |
| eval | chore!: deprecate eval/tasks (#1186) | 2025-02-20 14:06:21 -08:00 |
| inference | fix: resolve type hint issues and import dependencies (#1176) | 2025-02-25 11:06:47 -08:00 |
| ios/inference | chore: removed executorch submodule (#1265) | 2025-02-25 21:57:21 -08:00 |
| post_training | feat: Enable CPU training for torchtune (#1140) | 2025-02-19 22:42:58 -08:00 |
| safety | chore: move all Llama Stack types from llama-models to llama-stack (#1098) | 2025-02-14 09:10:59 -08:00 |
| scoring | feat: add aggregation_functions to llm_as_judge_405b_simpleqa (#1164) | 2025-02-19 19:42:04 -08:00 |
| telemetry | build: format codebase imports using ruff linter (#1028) | 2025-02-13 10:06:21 -08:00 |
| tool_runtime | fix: resolve type hint issues and import dependencies (#1176) | 2025-02-25 11:06:47 -08:00 |
| vector_io | Fix sqlite_vec config defaults | 2025-02-20 17:50:33 -08:00 |
| __init__.py | impls -> inline, adapters -> remote (#381) | 2024-11-06 14:54:05 -08:00 |