Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-12-27 10:51:58 +00:00
This adds a `builtin::document_conversion` tool for converting documents when used with file_search; it uses meta-llama/synthetic-data-kit. I also have another local implementation that uses Docling, but I need to debug some segfault issues I'm hitting locally with that, so I'm pushing this first as a simpler reference implementation. Long-term I think we'll want a remote implementation here as well, perhaps docling-serve or unstructured.io, but that needs more investigation. This passes the existing `tests/verifications/openai_api/test_responses.py` but doesn't yet add any new tests for file types besides text and PDF.

Signed-off-by: Ben Browning <bbrownin@redhat.com>
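As a rough illustration of where such a tool fits, here is a minimal sketch of an OpenAI-style Responses request payload that pairs a file_search tool with a document-conversion tool. The exact request shape, the model id, and the vector store id are assumptions for illustration, not taken from this PR:

```python
# Hypothetical sketch only: field names follow the OpenAI Responses API
# convention; "vs_123" and the model id are placeholder values.
payload = {
    "model": "meta-llama/Llama-3.2-3B-Instruct",  # placeholder model id
    "input": "Summarize the attached report.",
    "tools": [
        # file_search retrieves chunks from an indexed vector store;
        # the conversion tool would handle non-text formats (e.g. PDF)
        # before indexing.
        {"type": "file_search", "vector_store_ids": ["vs_123"]},
    ],
}

# The tool name added by this PR, as a plain string for reference.
conversion_tool = "builtin::document_conversion"
```

This is only meant to show the relationship between the two tools; consult the repository's own tests for the authoritative request format.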
Directory contents:

- `k8s`
- `ondevice_distro`
- `remote_hosted_distro`
- `self_hosted_distro`
- `building_distro.md`
- `configuration.md`
- `importing_as_library.md`
- `index.md`
- `kubernetes_deployment.md`
- `list_of_distributions.md`
- `starting_llama_stack_server.md`