llama-stack/llama_stack/providers/remote/inference/bedrock
__init__.py   Split safety into (llama-guard, prompt-guard, code-scanner) (#400)     2024-11-11 09:29:18 -08:00
bedrock.py    Don't include 3B / 1B models for Bedrock since they aren't on-demand   2024-12-18 06:30:02 -08:00
config.py     Update more distribution docs to be simpler and partially codegen'ed   2024-11-20 22:03:44 -08:00