phoenix-oss / llama-stack
(forked from phoenix-oss/llama-stack-mirror)

llama-stack / llama_stack / providers / inline / inference

Latest commit 9a5803a429: "move all implementations to use updated type" (Ashwin Bharambe, 2025-01-13 23:16:53 -08:00)

meta_reference          move all implementations to use updated type (2025-01-13 23:16:53 -08:00)
sentence_transformers   remove conflicting default for tool prompt format in chat completion (#742) (2025-01-10 10:41:53 -08:00)
vllm                    Add inline vLLM inference provider to regression tests and fix regressions (#662) (2025-01-10 16:35:16 -08:00)
__init__.py             precommit (2024-11-08 17:58:58 -08:00)
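
The directory ships an __init__.py, so the entries above are importable as Python packages under llama_stack.providers.inline.inference. As a minimal illustrative sketch (not part of the repository), and assuming llama_stack is installed in the active environment, the available inline inference provider packages can be enumerated with the standard library:

```python
# Illustrative sketch only: list the inline inference provider packages shown
# in the directory listing above. Assumes the llama_stack package is importable
# (e.g. installed into the current environment).
import pkgutil

import llama_stack.providers.inline.inference as inline_inference

# pkgutil.iter_modules only scans the package directory; it does not import the
# providers themselves, so their optional dependencies (torch, vllm, ...) are
# not needed just to see what is available.
for module_info in pkgutil.iter_modules(inline_inference.__path__):
    print(module_info.name)  # expected: meta_reference, sentence_transformers, vllm
```

Actual provider selection in llama-stack goes through its own registry and run configuration; the snippet only mirrors the directory layout shown in this listing.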