Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-12-27 14:38:05 +00:00
This adds a `builtin::document_conversion` tool for converting documents when used with file_search; it uses meta-llama/synthetic-data-kit. I also have another local implementation that uses Docling, but I need to debug some segfault issues I'm hitting locally with that, so I'm pushing this first as a simpler reference implementation. Long term I think we'll also want a remote implementation here, perhaps docling-serve or unstructured.io, but that needs more investigation. This passes the existing `tests/verifications/openai_api/test_responses.py` but doesn't yet add any new tests for file types besides text and PDF.

Signed-off-by: Ben Browning <bbrownin@redhat.com>
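As a rough illustration of the flow this change targets, here is a minimal sketch of uploading a non-text document and querying it through file_search via Llama Stack's OpenAI-compatible endpoints, where document conversion would happen implicitly during ingestion. The base URL, model name, and file name are placeholders and not taken from this PR.

```python
# Hedged sketch: exercise file_search (and, implicitly, document conversion)
# through Llama Stack's OpenAI-compatible API using the OpenAI Python client.
# Endpoint URL, model, and file name below are assumptions, not values from this PR.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8321/v1/openai/v1", api_key="none")

# Upload a non-text document; a document conversion step is expected to
# turn it into text before it is indexed for file_search.
uploaded = client.files.create(file=open("report.pdf", "rb"), purpose="assistants")

# Create a vector store and attach the uploaded file to it.
vector_store = client.vector_stores.create(name="docs")
client.vector_stores.files.create(vector_store_id=vector_store.id, file_id=uploaded.id)

# Ask a question grounded in the uploaded document via the file_search tool.
response = client.responses.create(
    model="meta-llama/Llama-3.2-3B-Instruct",
    input="Summarize the key findings in the uploaded report.",
    tools=[{"type": "file_search", "vector_store_ids": [vector_store.id]}],
)
print(response.output_text)
```

This mirrors the shape of the existing Responses API verification tests; the only behavioral difference introduced by the PR is that PDF (and, eventually, other) uploads get converted before indexing rather than being rejected.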
| Directory contents |
|---|
| bedrock.md |
| cerebras.md |
| dell-tgi.md |
| dell.md |
| fireworks.md |
| groq.md |
| meta-reference-gpu.md |
| nvidia.md |
| ollama.md |
| passthrough.md |
| remote-vllm.md |
| sambanova.md |
| tgi.md |
| together.md |