llama-stack-mirror/llama_stack (last commit: 2025-06-10 09:46:08 +01:00)
| Name | Last commit | Date |
| --- | --- | --- |
| apis | Add delete_openai_response route, define delete OpenAI message schema and make an integration test | 2025-06-10 09:46:08 +01:00 |
| cli | fix: resolve template name to config path in llama stack run (#2361) | 2025-06-03 14:39:12 -07:00 |
| distribution | feat: To add health status check for remote VLLM (#2303) | 2025-06-06 15:33:12 -04:00 |
| models | chore: remove usage of load_tiktoken_bpe (#2276) | 2025-06-02 07:33:37 -07:00 |
| providers | Add delete_openai_response route, define delete OpenAI message schema and make an integration test | 2025-06-10 09:46:08 +01:00 |
| strong_typing | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| templates | feat: add deps dynamically based on metastore config (#2405) | 2025-06-05 14:07:25 -07:00 |
| ui | build: Bump version to 0.2.10 | 2025-06-05 22:56:39 +00:00 |
| __init__.py | export LibraryClient | 2024-12-13 12:08:00 -08:00 |
| env.py | refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) | 2025-03-04 14:53:47 -08:00 |
| log.py | chore: make cprint write to stderr (#2250) | 2025-05-24 23:39:57 -07:00 |
| schema_utils.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |