phoenix-oss / llama-stack
forked from phoenix-oss/llama-stack-mirror
llama-stack / llama_stack / providers / utils / inference
Latest commit bf11cc0450 (Sébastien Han, 2025-02-11 22:10:28 -08:00): chore: update return type to Optional[str] (#982)
__init__.py         Fix precommit check after moving to ruff (#927)                                                     2025-02-02 06:46:45 -08:00
embedding_mixin.py  Fix precommit check after moving to ruff (#927)                                                     2025-02-02 06:46:45 -08:00
model_registry.py   chore: update return type to Optional[str] (#982)                                                   2025-02-11 22:10:28 -08:00
openai_compat.py    perf: ensure ToolCall in ChatCompletionResponse is subset of ChatCompletionRequest.tools (#1041)    2025-02-11 18:31:35 -08:00
prompt_adapter.py   Support sys_prompt behavior in inference (#937)                                                     2025-02-03 23:35:16 -08:00