llama-stack-mirror/llama_stack

Commit e7eb9f9adc by Juanma: fix: dataset metadata without provider_id (#2527)
# What does this PR do?
Fixes an error that occurred when inferring a dataset's provider_id for datasets registered with metadata but without an explicit provider_id.

Closes [#2506](https://github.com/meta-llama/llama-stack/issues/2506)
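For context, a minimal sketch of the kind of registration this fix concerns, assuming the llama-stack-client Python SDK; the dataset id, source URI, purpose, and metadata values are illustrative and not taken from the issue:

```python
# Registering a dataset with metadata but no explicit provider_id, so the
# server must infer which datasetio provider owns it. Values are illustrative.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")  # assumes a running stack

client.datasets.register(
    purpose="eval/messages-answer",
    source={
        "type": "uri",
        "uri": "huggingface://datasets/llamastack/simpleqa?split=train",
    },
    dataset_id="simpleqa",        # hypothetical id
    metadata={"split": "train"},  # metadata is supplied...
    # ...but provider_id is intentionally omitted and should be inferred
)
```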

Signed-off-by: Juanma Barea <juanmabareamartinez@gmail.com>
Committed: 2025-06-27 08:51:29 -04:00
| Name | Last commit | Date |
| --- | --- | --- |
| `apis` | fix: finish conversion to StrEnum (#2514) | 2025-06-26 08:01:26 +05:30 |
| `cli` | fix: stack build (#2485) | 2025-06-20 15:15:43 -07:00 |
| `distribution` | fix: dataset metadata without provider_id (#2527) | 2025-06-27 08:51:29 -04:00 |
| `models` | fix: finish conversion to StrEnum (#2514) | 2025-06-26 08:01:26 +05:30 |
| `providers` | fix: Some missed env variable changes from PR 2490 (#2538) | 2025-06-26 17:59:15 -07:00 |
| `strong_typing` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `templates` | fix: Some missed env variable changes from PR 2490 (#2538) | 2025-06-26 17:59:15 -07:00 |
| `ui` | fix(ui): ensure initial data fetch only happens once (#2486) | 2025-06-24 12:22:55 +02:00 |
| `__init__.py` | export LibraryClient | 2024-12-13 12:08:00 -08:00 |
| `env.py` | refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) | 2025-03-04 14:53:47 -08:00 |
| `log.py` | chore: remove nested imports (#2515) | 2025-06-26 08:01:05 +05:30 |
| `schema_utils.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |