phoenix-oss/llama-stack (forked from phoenix-oss/llama-stack-mirror)
llama_stack/providers/adapters/inference/vllm at ffedb81c11
Latest commit: bf4f97a2e1 "Fix vLLM adapter chat_completion signature" by Ashwin Bharambe, 2024-11-01 13:09:03 -07:00
__init__.py    Add vLLM inference provider for OpenAI compatible vLLM server (#178)    2024-10-20 18:43:25 -07:00
config.py      Add vLLM inference provider for OpenAI compatible vLLM server (#178)    2024-10-20 18:43:25 -07:00
vllm.py        Fix vLLM adapter chat_completion signature                               2024-11-01 13:09:03 -07:00
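
The commit messages above indicate that this adapter targets an OpenAI-compatible vLLM server (#178). As a point of reference only, here is a minimal sketch of querying such a server directly with the standard openai Python client; the endpoint URL, API key, and model name are placeholder assumptions for illustration and are not taken from this adapter's config.py or vllm.py.

    # Hypothetical sketch: calling an OpenAI-compatible vLLM server directly.
    # base_url, api_key, and model below are illustrative assumptions.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",   # vLLM's OpenAI-compatible endpoint (assumed)
        api_key="not-needed-for-local-vllm",   # a local vLLM server typically ignores the key
    )

    response = client.chat.completions.create(
        model="meta-llama/Llama-3.1-8B-Instruct",  # whichever model the vLLM server was launched with
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)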