llama-stack / llama_stack / providers / remote / inference (at commit 52a21ce78f)

Latest commit: d9d34433fc "Update spec" by Ashwin Bharambe, 2025-01-13 23:16:53 -08:00
Name         | Last commit                                                                  | Date
bedrock      | remove conflicting default for tool prompt format in chat completion (#742) | 2025-01-10 10:41:53 -08:00
cerebras     | remove conflicting default for tool prompt format in chat completion (#742) | 2025-01-10 10:41:53 -08:00
databricks   | remove conflicting default for tool prompt format in chat completion (#742) | 2025-01-10 10:41:53 -08:00
fireworks    | [Fireworks] Update model name for Fireworks (#753)                          | 2025-01-13 15:53:57 -08:00
groq         | Update spec                                                                  | 2025-01-13 23:16:53 -08:00
nvidia       | Update spec                                                                  | 2025-01-13 23:16:53 -08:00
ollama       | remove conflicting default for tool prompt format in chat completion (#742) | 2025-01-10 10:41:53 -08:00
sample       | [remove import *] clean up import *'s (#689)                                | 2024-12-27 15:45:44 -08:00
tgi          | remove conflicting default for tool prompt format in chat completion (#742) | 2025-01-10 10:41:53 -08:00
together     | remove conflicting default for tool prompt format in chat completion (#742) | 2025-01-10 10:41:53 -08:00
vllm         | remove conflicting default for tool prompt format in chat completion (#742) | 2025-01-10 10:41:53 -08:00
__init__.py  | impls -> inline, adapters -> remote (#381)                                  | 2024-11-06 14:54:05 -08:00