llama-stack-mirror/llama_stack/providers/utils/inference
Latest commit: 2665f00102 by Ashwin Bharambe, 2025-07-30 23:30:53 -07:00
chore(rename): move llama_stack.distribution to llama_stack.core (#2975)

We would like to rename the term `template` to `distribution`; this change is a precursor to that rename.

cc @leseb
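
The change in #2975 is primarily an import-path move. A minimal compatibility sketch is shown below; the `datatypes` submodule name is an illustrative assumption, since this listing does not show which modules were relocated.

```python
# Minimal sketch, assuming a `datatypes` submodule exists under both package
# layouts (an illustrative assumption, not confirmed by this listing).
try:
    # New package path introduced by #2975.
    from llama_stack.core import datatypes
except ImportError:
    # Older releases still ship the module under llama_stack.distribution.
    from llama_stack.distribution import datatypes
```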
__init__.py chore: enable pyupgrade fixes (#1806) 2025-05-01 14:23:50 -07:00
embedding_mixin.py feat(registry): more flexible model lookup (#2859) 2025-07-22 15:22:48 -07:00
inference_store.py chore(rename): move llama_stack.distribution to llama_stack.core (#2975) 2025-07-30 23:30:53 -07:00
litellm_openai_mixin.py chore(rename): move llama_stack.distribution to llama_stack.core (#2975) 2025-07-30 23:30:53 -07:00
model_registry.py feat(starter)!: simplify starter distro; litellm model registry changes (#2916) 2025-07-25 15:02:04 -07:00
openai_compat.py chore: remove nested imports (#2515) 2025-06-26 08:01:05 +05:30
openai_mixin.py chore: create OpenAIMixin for inference providers with an OpenAI-compat API that need to implement openai_* methods (#2835) 2025-07-23 06:49:40 -04:00 (a sketch of this mixin pattern follows the listing)
prompt_adapter.py fix(ollama): Download remote image URLs for Ollama (#2551) 2025-06-30 20:36:11 +05:30
stream_utils.py feat: drop python 3.10 support (#2469) 2025-06-19 12:07:14 +05:30
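
The `openai_mixin.py` entry above describes a mixin that centralizes the `openai_*` methods for providers exposing an OpenAI-compatible API. The sketch below only illustrates that pattern; it is not the actual `OpenAIMixin` implementation, and names beyond `openai_*` (such as `get_base_url` and `get_api_key`) are assumptions.

```python
# Illustrative sketch of an OpenAI-compat mixin; not the real OpenAIMixin.
from openai import AsyncOpenAI


class ExampleOpenAIMixin:
    """Shared openai_* plumbing for providers that speak the OpenAI API."""

    # Subclasses supply the endpoint and credentials for their backend
    # (hypothetical hook names for this sketch).
    def get_base_url(self) -> str:
        raise NotImplementedError

    def get_api_key(self) -> str:
        raise NotImplementedError

    @property
    def client(self) -> AsyncOpenAI:
        # Build a client against the provider's OpenAI-compatible endpoint.
        return AsyncOpenAI(base_url=self.get_base_url(), api_key=self.get_api_key())

    async def openai_completion(self, model: str, prompt: str, **kwargs):
        # Delegate to the OpenAI-compatible completions endpoint.
        return await self.client.completions.create(model=model, prompt=prompt, **kwargs)

    async def openai_chat_completion(self, model: str, messages: list, **kwargs):
        # Delegate to the OpenAI-compatible chat completions endpoint.
        return await self.client.chat.completions.create(model=model, messages=messages, **kwargs)
```

A concrete provider would subclass such a mixin alongside its registry and config plumbing, supplying only the base URL and API key for its backend.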