llama-stack/llama_stack/providers/remote/inference/bedrock

Latest commit: cdcbeb005b by Ashwin Bharambe (2025-02-19 19:01:29 -08:00)
chore: remove llama_models.llama3.api imports from providers (#1107)

    There should be a choke-point for llama3.api imports -- this is the
    prompt adapter. Creating a ChatFormat() object on demand is inexpensive.
    The underlying Tokenizer is a singleton anyway.
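The commit note's rationale -- a singleton Tokenizer makes on-demand ChatFormat construction cheap -- can be sketched roughly as follows. The class names mirror the note, but the bodies are illustrative stand-ins, not llama-stack's actual implementation:

```python
class Tokenizer:
    """Process-wide singleton: expensive state (e.g. vocab) loads once."""

    _instance = None

    @classmethod
    def get_instance(cls):
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance

    def encode(self, text):
        # Stand-in for real tokenization: split on whitespace.
        return text.split()


class ChatFormat:
    """Cheap to construct on demand: holds only a reference to the
    shared tokenizer singleton, so creating one per request is fine."""

    def __init__(self):
        self.tokenizer = Tokenizer.get_instance()

    def encode_message(self, content):
        return self.tokenizer.encode(content)
```

Because every ChatFormat shares the same Tokenizer instance, there is no need to thread a tokenizer through provider code -- which is what lets the prompt adapter act as the single choke-point for llama3.api imports.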
__init__.py Split safety into (llama-guard, prompt-guard, code-scanner) (#400) 2024-11-11 09:29:18 -08:00
bedrock.py chore: remove llama_models.llama3.api imports from providers (#1107) 2025-02-19 19:01:29 -08:00
config.py Update more distribution docs to be simpler and partially codegen'ed 2024-11-20 22:03:44 -08:00
models.py fix: Get distro_codegen.py working with default deps and enabled in pre-commit hooks (#1123) 2025-02-19 18:39:20 -08:00