llama-stack/llama_stack/providers/inline/inference
Ashwin Bharambe 9f14382d82
meta reference inference fixes (#797)

Miscellaneous fixes for meta reference inference.

Tests for log probs don't pass because meta reference does not support top_k > 1.
2025-01-16 18:17:46 -08:00
meta_reference          meta reference inference fixes (#797)                                         2025-01-16 18:17:46 -08:00
sentence_transformers   remove conflicting default for tool prompt format in chat completion (#742)   2025-01-10 10:41:53 -08:00
vllm                    Convert SamplingParams.strategy to a union (#767)                              2025-01-15 05:38:51 -08:00
__init__.py             precommit                                                                     2024-11-08 17:58:58 -08:00