Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-10-04 04:04:14 +00:00)
The Bedrock provider was returning None for both streaming and non-streaming inference, causing 'NoneType' object has no attribute 'choices' errors.

Primary fix: reorder the inheritance so that the mixin classes come before the protocol class in BedrockInferenceAdapter, so the actual implementations are called instead of the protocol's stubs.

Additional AWS Bedrock API compatibility fixes:
- Fix non-streaming: use res["body"].read() instead of next(res["body"])
- Fix streaming: add proper event-structure checks and safe access
- Disable repetition_penalty (not supported by Bedrock Llama models)

Fixes #3621
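The inheritance-order fix above comes down to Python's method resolution order (MRO): bases are searched left to right, so a protocol class listed first can shadow a mixin's concrete method with a stub that returns None. The following is a minimal, self-contained sketch of that failure mode; the class and method names here are illustrative stand-ins, not the adapter's actual API.

```python
class InferenceProtocol:
    # Protocol-style base: the stub body implicitly returns None,
    # which is the 'NoneType' symptom described in the bug report.
    def chat_completion(self, prompt):
        ...

class InferenceMixin:
    # Mixin carrying the real implementation.
    def chat_completion(self, prompt):
        return {"choices": [{"text": f"echo: {prompt}"}]}

# Buggy base order: the protocol is searched first in the MRO,
# so its stub wins and callers get None back.
class BuggyAdapter(InferenceProtocol, InferenceMixin):
    pass

# Fixed base order: the mixin is searched first, so the
# concrete implementation is the one that gets called.
class FixedAdapter(InferenceMixin, InferenceProtocol):
    pass

print(BuggyAdapter().chat_completion("hi"))  # None
print(FixedAdapter().chat_completion("hi"))  # {'choices': [{'text': 'echo: hi'}]}
```

Inspecting `FixedAdapter.__mro__` confirms the mixin precedes the protocol, which is exactly the property the reordering restores.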