AWS Bedrock now requires region-prefixed inference profile IDs instead of plain foundation model IDs for on-demand throughput. This change adds auto-detection based on the boto3 client region, converting model IDs like meta.llama3-1-70b-instruct-v1:0 to us.meta.llama3-1-70b-instruct-v1:0 depending on the region. It handles edge cases such as ARNs, case sensitivity, and None regions, and fixes the ValidationException raised when using on-demand throughput.
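
A minimal sketch of that conversion, assuming the standard Bedrock geo prefixes (us, us-gov, eu, apac); the helper names, provider list, and exact prefix map below are illustrative assumptions, not the repository's actual code:

```python
def _get_region_prefix(region: str | None) -> str:
    # Fall back to the "us" prefix when the client region cannot be determined.
    if region is None:
        return "us."
    region = region.lower()  # normalize case before matching
    if region.startswith("us-gov-"):
        return "us-gov."
    if region.startswith("us-"):
        return "us."
    if region.startswith("eu-"):
        return "eu."
    if region.startswith("ap-"):
        return "apac."
    return "us."


def _to_inference_profile_id(model_id: str, region: str | None) -> str:
    # ARNs already identify a specific model or profile; leave them untouched.
    if model_id.startswith("arn:"):
        return model_id
    # Only bare foundation model IDs need a prefix; already-prefixed IDs
    # (e.g. "us.meta....") do not match these provider prefixes and pass through.
    if not model_id.startswith(("meta.", "anthropic.", "mistral.", "amazon.")):
        return model_id
    return _get_region_prefix(region) + model_id
```

With a boto3 client created in us-east-1, calling _to_inference_profile_id("meta.llama3-1-70b-instruct-v1:0", client.meta.region_name) would return us.meta.llama3-1-70b-instruct-v1:0, while ARNs and already-prefixed IDs pass through unchanged.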