llama-stack-mirror/llama_stack/providers/remote
skamenan7 88e60c1bf6 fix: Bedrock provider returning None due to inheritance order
The Bedrock provider was returning None for both streaming and non-streaming
inference, causing 'NoneType' object has no attribute 'choices' errors.

Primary fix: reorder the inheritance so the mixin classes come before the
protocol class in BedrockInferenceAdapter, so the actual implementations are called.
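
A minimal, self-contained sketch of the inheritance-order problem; the base-class
names here are hypothetical stand-ins, not the actual llama_stack classes. With the
protocol class listed first, Python's MRO resolves the call to the protocol's empty
stub, which returns None; listing the mixin first makes the real implementation win.

    # Illustrative sketch only: class names besides BedrockInferenceAdapter
    # are hypothetical stand-ins for the real llama_stack base classes.
    class InferenceProtocol:
        def chat_completion(self, request):
            ...  # protocol-style stub: empty body, so calling it returns None

    class OpenAICompatMixin:
        def chat_completion(self, request):
            # the real implementation lives in the mixin
            return {"choices": [{"message": "real response"}]}

    # Broken ordering: the protocol's stub shadows the mixin in the MRO,
    # so callers get None and later hit "'NoneType' object has no attribute 'choices'".
    class BrokenAdapter(InferenceProtocol, OpenAICompatMixin):
        pass

    # Fixed ordering: mixin before protocol, so the implementation is called.
    class BedrockInferenceAdapter(OpenAICompatMixin, InferenceProtocol):
        pass

    print(BrokenAdapter().chat_completion(None))            # -> None
    print(BedrockInferenceAdapter().chat_completion(None))  # -> {'choices': [...]}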

Additional AWS Bedrock API compatibility fixes (see the sketch after this list):
- Fix non-streaming: use res["body"].read() instead of next(res["body"])
- Fix streaming: add proper event structure checks and safe access
- Disable repetition_penalty (not supported by Bedrock Llama models)
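
A hedged sketch of the response-handling changes, assuming a boto3 bedrock-runtime
client; the model id, prompt payload, and field names are illustrative rather than
copied from the adapter. Calling next() on the non-streaming StreamingBody only pulls
the first chunk of the HTTP body, so the payload is read in full before JSON-decoding.

    # Assumed boto3 "bedrock-runtime" usage; model id and payload are examples.
    import json
    import boto3

    client = boto3.client("bedrock-runtime")
    # repetition_penalty is deliberately omitted from the request body,
    # since Bedrock Llama models do not support it.
    body = json.dumps({"prompt": "Hello", "max_gen_len": 64, "temperature": 0.7})
    model_id = "meta.llama3-8b-instruct-v1:0"

    # Non-streaming: res["body"] is a StreamingBody, so read it in full
    # rather than calling next() on it.
    res = client.invoke_model(modelId=model_id, body=body)
    result = json.loads(res["body"].read())
    print(result.get("generation"))

    # Streaming: not every event is guaranteed to carry a decodable chunk,
    # so check the event structure before accessing it.
    stream = client.invoke_model_with_response_stream(modelId=model_id, body=body)
    for event in stream.get("body", []):
        chunk = event.get("chunk")
        if not chunk or "bytes" not in chunk:
            continue  # skip events without content instead of crashing
        payload = json.loads(chunk["bytes"])
        print(payload.get("generation", ""), end="")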

Fixes #3621
2025-09-30 16:52:23 -04:00
agents test: add unit test to ensure all config types are instantiable (#1601) 2025-03-12 22:29:58 -07:00
datasetio chore(misc): make tests and starter faster (#3042) 2025-08-05 14:55:05 -07:00
eval feat: add static embedding metadata to dynamic model listings for providers using OpenAIMixin (#3547) 2025-09-25 17:17:00 -04:00
files/s3 fix(expires_after): make sure multipart/form-data is properly parsed (#3612) 2025-09-30 16:14:03 -04:00
inference fix: Bedrock provider returning None due to inheritance order 2025-09-30 16:52:23 -04:00
post_training fix: remove inference.completion from docs (#3589) 2025-09-29 13:14:41 -07:00
safety refactor(logging): rename llama_stack logger categories (#3065) 2025-08-21 17:31:04 -07:00
tool_runtime chore(rename): move llama_stack.distribution to llama_stack.core (#2975) 2025-07-30 23:30:53 -07:00
vector_io feat: update qdrant hash function from SHA-1 to SHA-256 (#3477) 2025-09-17 15:10:10 -07:00
__init__.py impls -> inline, adapters -> remote (#381) 2024-11-06 14:54:05 -08:00