llama-stack-mirror/llama_stack/providers/utils
Ashwin Bharambe e84d4436b5
Since we are pushing for HF repos, we should accept them in inference configs (#497)
# What does this PR do?

As the title says: inference configs should accept Hugging Face repo names, since we are pushing for HF repos.

## Test Plan

This depends on commit
8752149f58
also landing, so the next package release (0.0.54) will be needed for this to work properly.

The test is:

```bash
pytest -v -s -m "llama_3b and meta_reference" test_model_registration.py
```
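To illustrate the intent of the change, here is a hypothetical inference config fragment (the provider and field names below are illustrative assumptions, not taken from this PR): after this lands, the `model` field should accept a Hugging Face repo id in addition to the native Llama model descriptor.

```yaml
# Hypothetical run config fragment -- field names are illustrative.
# After this change, `model` should accept an HF repo id
# (namespace/repo_name) as well as the native descriptor.
inference:
  provider_type: meta-reference
  config:
    model: meta-llama/Llama-3.2-3B   # HF repo id, now accepted
    # model: Llama3.2-3B             # native descriptor, also accepted
```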
2024-11-20 16:14:37 -08:00
| Name | Last commit | Date |
|------|-------------|------|
| bedrock | add bedrock distribution code (#358) | 2024-11-06 14:39:11 -08:00 |
| datasetio | [Evals API][11/n] huggingface dataset provider + mmlu scoring fn (#392) | 2024-11-11 14:49:50 -05:00 |
| inference | Since we are pushing for HF repos, we should accept them in inference configs (#497) | 2024-11-20 16:14:37 -08:00 |
| kvstore | Auto-generate distro yamls + docs (#468) | 2024-11-18 14:57:06 -08:00 |
| memory | unregister for memory banks and remove update API (#458) | 2024-11-14 17:12:11 -08:00 |
| scoring | fix tests after registration migration & rename meta-reference -> basic / llm_as_judge provider (#424) | 2024-11-12 10:35:44 -05:00 |
| telemetry | telemetry WARNING->WARN fix | 2024-10-21 18:52:48 -07:00 |
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |