llama-stack-mirror/llama_stack/providers/inline
Ben Browning a5ebac18bd Add a helpful error message when Faiss fails to load indexes
We've had a few cases recently where Faiss is unable to load its
indexes properly, such as after users upgrade their Llama Stack or
numpy dependencies. This change catches any exception raised while
deserializing the index and re-raises it with an error message
explaining what likely went wrong and how to fix it, so users can
resolve the problem themselves without confusion or needing to file
an issue.
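The wrapping described above can be sketched roughly as follows. This is an illustrative sketch, not the actual Llama Stack code: `load_index_with_hint`, its parameters, and the message wording are all assumptions, and the real implementation calls Faiss directly rather than taking a `deserialize` callable.

```python
# Hypothetical sketch: wrap index deserialization so a failure surfaces
# an actionable message instead of a bare traceback. `kvstore_desc` is
# whatever str(kvstore) yields, pointing at the storage to delete.

def load_index_with_hint(deserialize, data, kvstore_desc):
    try:
        return deserialize(data)
    except Exception as exc:
        raise RuntimeError(
            "Error deserializing Faiss index. This is likely caused by a "
            "change in the serialized index format, e.g. after upgrading "
            "Llama Stack or its numpy dependency. Deleting the stored "
            f"index ({kvstore_desc}) and letting it be rebuilt may "
            "resolve this."
        ) from exc
```

Chaining with `from exc` preserves the original traceback, so the underlying Faiss/numpy error is still visible for debugging.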

New users are most likely to hit this with the SqliteKVStoreImpl,
since that's the local storage most people use when trying out Llama
Stack. So I added a `__str__` method to it that shows the path and
table name users may need to delete to resolve this error.
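The idea behind that `__str__` method can be sketched as below. The class and field names here are illustrative assumptions; the real SqliteKVStoreImpl has a different constructor and configuration, but the point is the same: make the store describe its own file and table so the error message above can name them.

```python
# Hypothetical sketch of the __str__ idea (not the real SqliteKVStoreImpl):
# the store names the file and table a user would delete to reset state.

class SqliteKVStore:
    def __init__(self, db_path: str, table_name: str):
        self.db_path = db_path        # path to the SQLite database file
        self.table_name = table_name  # table holding the serialized index

    def __str__(self) -> str:
        return f"sqlite table {self.table_name} in {self.db_path}"
```

With this, interpolating the store into the error message (`f"... ({kvstore}) ..."`) yields a concrete path and table name rather than a default `<object at 0x...>` repr.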

Signed-off-by: Ben Browning <bbrownin@redhat.com>
2025-06-27 14:16:05 -04:00
agents chore: remove nested imports (#2515) 2025-06-26 08:01:05 +05:30
datasetio chore(refact): move paginate_records fn outside of datasetio (#2137) 2025-05-12 10:56:14 -07:00
eval chore: remove nested imports (#2515) 2025-06-26 08:01:05 +05:30
files/localfs refactor(env)!: enhanced environment variable substitution (#2490) 2025-06-26 08:20:08 +05:30
inference refactor(env)!: enhanced environment variable substitution (#2490) 2025-06-26 08:20:08 +05:30
ios/inference chore: removed executorch submodule (#1265) 2025-02-25 21:57:21 -08:00
post_training ci: add python package build test (#2457) 2025-06-19 18:57:32 +05:30
safety feat: add cpu/cuda config for prompt guard (#2194) 2025-05-28 12:23:15 -07:00
scoring refactor(env)!: enhanced environment variable substitution (#2490) 2025-06-26 08:20:08 +05:30
telemetry refactor(env)!: enhanced environment variable substitution (#2490) 2025-06-26 08:20:08 +05:30
tool_runtime feat: Add ChunkMetadata to Chunk (#2497) 2025-06-25 15:55:23 -04:00
vector_io Add a helpful error message when Faiss fails to load indexes 2025-06-27 14:16:05 -04:00
__init__.py impls -> inline, adapters -> remote (#381) 2024-11-06 14:54:05 -08:00