llama-stack-mirror/llama_stack/providers/adapters/inference
Last updated: 2024-09-27 11:41:37 -07:00

..
bedrock      Bump version to 0.0.24 (#94)                                 2024-09-25 09:31:12 -07:00
fireworks    Support for Llama3.2 models and Swift SDK (#98)              2024-09-25 10:29:58 -07:00
ollama       Support for Llama3.2 models and Swift SDK (#98)              2024-09-25 10:29:58 -07:00
sample       [API Updates] Model / shield / memory-bank routing + agent
             persistence + support for private headers (#92)             2024-09-23 14:22:22 -07:00
tgi          Make TGI adapter compatible with HF Inference API (#97)      2024-09-25 14:08:31 -07:00
together     Fix safety inference and the safety adapter for the new
             API spec; pin llama_models to 0.0.24, since the latest
             0.0.35 changed the model descriptor name; add the
             dependency to requirements.txt to fix a missing-package
             error at runtime                                            2024-09-27 11:41:37 -07:00
__init__.py  API Updates (#73)                                           2024-09-17 19:51:35 -07:00