llama-stack-mirror/llama_stack/providers/remote/inference/bedrock
__init__.py   Split safety into (llama-guard, prompt-guard, code-scanner) (#400)    2024-11-11 09:29:18 -08:00
bedrock.py    infer tool prompt format                                               2025-01-09 17:22:52 -08:00
config.py     Update more distribution docs to be simpler and partially codegen'ed   2024-11-20 22:03:44 -08:00
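The split of this provider directory into a config module, an adapter module, and a package entry point follows a common pattern for remote inference providers. Below is a minimal, self-contained sketch of how those three roles typically fit together; all class and function names here (BedrockConfig, BedrockInferenceAdapter, get_adapter_impl) are illustrative assumptions, not the actual llama-stack symbols in these files.

```python
# Illustrative sketch only: names are assumptions about how a remote
# inference provider package like this one is typically wired.
import asyncio

from pydantic import BaseModel


class BedrockConfig(BaseModel):
    """Connection settings for a Bedrock adapter (the role config.py plays)."""

    region_name: str = "us-east-1"
    profile_name: str | None = None


class BedrockInferenceAdapter:
    """Adapter wrapping the Bedrock runtime client (the role bedrock.py plays)."""

    def __init__(self, config: BedrockConfig) -> None:
        self.config = config

    async def initialize(self) -> None:
        # A real adapter would create its AWS client here; this stub only
        # stands in for that step.
        pass


async def get_adapter_impl(config: BedrockConfig, _deps=None) -> BedrockInferenceAdapter:
    """Factory entry point (the role __init__.py plays): build and initialize the adapter."""
    impl = BedrockInferenceAdapter(config)
    await impl.initialize()
    return impl


if __name__ == "__main__":
    adapter = asyncio.run(get_adapter_impl(BedrockConfig()))
    print(adapter.config.region_name)
```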