llama-stack-mirror/llama_stack/providers/remote
Latest commit: 6634b21a76 "Merge upstream/main and resolve conflicts" by skamenan7, 2025-07-16 19:57:02 -04:00

Resolved merge conflicts in:
- Documentation files: updated the vector IO provider docs to include both the kvstore fields and the embedding model configuration
- Config files: merged the kvstore requirements from upstream with the embedding model fields
- Dependencies: updated to the latest client versions while preserving the llama-models dependency
- Lockfiles: regenerated to ensure consistency

All embedding model configuration features are preserved while incorporating the upstream changes.
agents           test: add unit test to ensure all config types are instantiable (#1601)                                    2025-03-12 22:29:58 -07:00
datasetio        fix: allow default empty vars for conditionals (#2570)                                                     2025-07-01 14:42:05 +02:00
eval             refactor(env)!: enhanced environment variable substitution (#2490)                                         2025-06-26 08:20:08 +05:30
inference        feat: create dynamic model registration for OpenAI and Llama compat remote inference providers (#2745)     2025-07-16 12:49:38 -04:00
post_training    fix: allow default empty vars for conditionals (#2570)                                                     2025-07-01 14:42:05 +02:00
safety           fix: sambanova shields and model validation (#2693)                                                        2025-07-11 16:29:15 -04:00
tool_runtime     fix: allow default empty vars for conditionals (#2570)                                                     2025-07-01 14:42:05 +02:00
vector_io        Merge upstream/main and resolve conflicts                                                                  2025-07-16 19:57:02 -04:00
__init__.py      impls -> inline, adapters -> remote (#381)                                                                 2024-11-06 14:54:05 -08:00