llama-stack/llama_stack/providers
Jeff Tang 82799a55bb
chore: removed executorch submodule (#1265)
# What does this PR do?
Removes the executorch submodule; it has been moved to the llama-stack-client-swift repo - PR:
https://github.com/meta-llama/llama-stack-client-swift/pull/22

2025-02-25 21:57:21 -08:00
| Name | Last commit | Date |
| --- | --- | --- |
| `inline` | chore: removed executorch submodule (#1265) | 2025-02-25 21:57:21 -08:00 |
| `registry` | chore: move embedding deps to RAG tool where they are needed (#1210) | 2025-02-21 11:33:41 -08:00 |
| `remote` | feat: Add Groq distribution template (#1173) | 2025-02-25 14:16:56 -08:00 |
| `tests` | feat: completing text /chat-completion and /completion tests (#1223) | 2025-02-25 11:37:04 -08:00 |
| `utils` | fix: dont assume SentenceTransformer is imported | 2025-02-25 16:53:01 -08:00 |
| `__init__.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `datatypes.py` | chore: move all Llama Stack types from llama-models to llama-stack (#1098) | 2025-02-14 09:10:59 -08:00 |