Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-17 19:12:36 +00:00)
AWS Bedrock now requires region-prefixed inference profile IDs instead of foundation model IDs for on-demand throughput. This change adds auto-detection based on the boto3 client region, converting model IDs such as `meta.llama3-1-70b-instruct-v1:0` to `us.meta.llama3-1-70b-instruct-v1:0` depending on the region. Edge cases such as ARNs, case sensitivity, and a `None` region are handled. This fixes the ValidationException raised when using on-demand throughput.
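
Below is a minimal sketch of the kind of auto-detection described above. The function name, the geographic-prefix table, and the exact matching rules are illustrative assumptions, not the actual llama-stack implementation; it simply shows how a foundation model ID could be mapped to a region-prefixed inference profile ID while leaving ARNs, already-prefixed IDs, and unknown regions untouched.

```python
# Hypothetical helper; names and prefix table are assumptions for illustration.

# Geographic prefixes Bedrock uses in cross-region inference profile IDs.
# "us-gov" is listed before "us" so GovCloud regions match the longer prefix first.
_GEO_PREFIXES = ("us-gov", "us", "eu", "ap", "ca", "sa")


def to_inference_profile_id(model_id: str, region: str | None) -> str:
    """Convert a foundation model ID into a region-prefixed inference profile ID.

    Returns the ID unchanged when it is an ARN, is already prefixed,
    or when the region is unknown (None).
    """
    # ARNs already point at a specific resource; leave them alone.
    if model_id.startswith("arn:"):
        return model_id

    # No region available (e.g. client not configured): nothing to infer from.
    if region is None:
        return model_id

    region = region.lower()
    for geo in _GEO_PREFIXES:
        if region.startswith(geo + "-"):
            # Skip if the model ID already carries this prefix (case-insensitive).
            if model_id.lower().startswith(geo + "."):
                return model_id
            return f"{geo}.{model_id}"

    # Region layout not recognized: fall back to the original ID.
    return model_id
```

For example, with a client created as `boto3.client("bedrock-runtime")`, passing `client.meta.region_name` (here assumed to be `us-east-1`) along with `meta.llama3-1-70b-instruct-v1:0` would yield `us.meta.llama3-1-70b-instruct-v1:0`, while an ARN or a `None` region would be returned unchanged.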
Directory contents:

- agents
- datasetio
- eval
- files/s3
- inference
- post_training
- safety
- tool_runtime
- vector_io
- __init__.py