llama-stack-mirror/llama_stack/providers/remote/inference/ramalama
Daniel J Walsh c9a41288a3 feat: RamaLama Documentation and Templates
RamaLama is a fully open source AI model tool that facilitates
local management of AI models.

https://github.com/containers/ramalama

It supports pulling models from Hugging Face, Ollama, and OCI image
registries, as well as from file://, http://, and https:// URIs.

It uses the llama.cpp and vLLM inference engines to run the models.

By default, it runs the models inside containers.
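As a rough sketch of the workflow the commit message describes, the RamaLama CLI can pull the same model from several source types and then run or serve it (the model names below are illustrative examples, not part of this repo, and exact flags may vary by RamaLama version):

```shell
# Hypothetical RamaLama usage sketch. The pull/run/serve subcommands are
# illustrated in comments; only a harmless availability check is executed,
# so this script is safe to run even where ramalama is not installed.
#
#   ramalama pull ollama://smollm:135m          # from the Ollama registry
#   ramalama pull huggingface://ibm-granite/granite-3b-code-base  # example HF model
#   ramalama pull oci://quay.io/example/mymodel # example OCI image source
#
#   ramalama run smollm:135m                    # chat with the model locally
#   ramalama serve smollm:135m                  # expose it as a local REST endpoint
#
if command -v ramalama >/dev/null 2>&1; then
    echo "ramalama is installed"
else
    echo "ramalama not installed; see https://github.com/containers/ramalama"
fi
```

Because models run inside containers by default, the host only needs the `ramalama` binary and a container engine such as Podman or Docker.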

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-04-18 12:55:52 -04:00
__init__.py feat: remote ramalama provider implementation 2025-04-18 12:54:42 -04:00
config.py feat: remote ramalama provider implementation 2025-04-18 12:54:42 -04:00
models.py feat: RamaLama Documentation and Templates 2025-04-18 12:55:52 -04:00
ramalama.py feat: RamaLama Documentation and Templates 2025-04-18 12:55:52 -04:00