Since we are pushing for HF repos, we should accept them in inference configs (#497)

# What does this PR do?

As the title says: since we are pushing users toward Hugging Face repo names, the inference configs should accept those names in addition to the internal model descriptors when validating `model`.

## Test Plan

This needs 8752149f58 to also land, so the next package (0.0.54) will make this work properly.

The test is:

```bash
pytest -v -s -m "llama_3b and meta_reference" test_model_registration.py
```
Ashwin Bharambe 2024-11-20 16:14:37 -08:00 committed by GitHub
parent b3f9e8b2f2
commit e84d4436b5
5 changed files with 14 additions and 8 deletions


```diff
@@ -37,8 +37,10 @@ class MetaReferenceInferenceConfig(BaseModel):
     @classmethod
     def validate_model(cls, model: str) -> str:
         permitted_models = supported_inference_models()
-        if model not in permitted_models:
-            model_list = "\n\t".join(permitted_models)
+        descriptors = [m.descriptor() for m in permitted_models]
+        repos = [m.huggingface_repo for m in permitted_models]
+        if model not in (descriptors + repos):
+            model_list = "\n\t".join(repos)
             raise ValueError(
                 f"Unknown model: `{model}`. Choose from [\n\t{model_list}\n]"
             )
```
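For reviewers, here is a minimal, self-contained sketch of the new lookup behavior. The `_Model` dataclass and the sample descriptor/repo strings are stand-ins for illustration only; in the real code the entries come from `supported_inference_models()` and expose `descriptor()` and `huggingface_repo` as shown in the diff above.

```python
from dataclasses import dataclass


@dataclass
class _Model:
    # Stand-in for an entry returned by supported_inference_models();
    # the real objects expose descriptor() and huggingface_repo.
    core_model_id: str
    huggingface_repo: str

    def descriptor(self) -> str:
        return self.core_model_id


def validate_model(model: str, permitted_models: list[_Model]) -> str:
    # Same lookup as the patched validator: accept either the internal
    # descriptor or the Hugging Face repo name.
    descriptors = [m.descriptor() for m in permitted_models]
    repos = [m.huggingface_repo for m in permitted_models]
    if model not in (descriptors + repos):
        model_list = "\n\t".join(repos)
        raise ValueError(f"Unknown model: `{model}`. Choose from [\n\t{model_list}\n]")
    return model


# Both spellings now pass validation (model names here are illustrative).
models = [_Model("Llama3.2-3B-Instruct", "meta-llama/Llama-3.2-3B-Instruct")]
assert validate_model("Llama3.2-3B-Instruct", models) == "Llama3.2-3B-Instruct"
assert validate_model("meta-llama/Llama-3.2-3B-Instruct", models) == "meta-llama/Llama-3.2-3B-Instruct"
```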