llama-stack-mirror/llama_stack
Vladislav Bronzov 09299e908e
Add windows support for build execution (#889)
# What does this PR do?

This PR implements Windows platform support for `build_container.sh`
execution from the terminal. Additionally, it resolves the "no support for
termios and pty on Windows" issues.

- [x] Addresses issue (#issue)
Related issues: https://github.com/meta-llama/llama-stack/issues/826,
https://github.com/meta-llama/llama-stack/issues/726
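
The fix amounts to gating the POSIX-only `pty`/`termios` usage behind a platform check so the module still imports and runs on Windows. A minimal sketch in Python — the helper names `build_command` and `run_build_script` are illustrative, not the actual llama-stack code:

```python
import platform
import subprocess


def build_command(script: str, args: list[str], system: str) -> list[str]:
    """Pick a shell interpreter for the build script per platform.

    Assumption: on Windows a `bash` (e.g. from Git for Windows or WSL)
    is on PATH, since there is no /bin/sh.
    """
    shell = "bash" if system == "Windows" else "sh"
    return [shell, script, *args]


def run_build_script(script: str, args: list[str]) -> int:
    system = platform.system()
    cmd = build_command(script, args, system)
    if system != "Windows":
        # pty (and termios underneath) are POSIX-only; importing lazily
        # here keeps the module loadable on Windows.
        import pty

        return pty.spawn(cmd)
    # Windows fallback: no pseudo-terminal, plain subprocess pipes.
    return subprocess.run(cmd).returncode
```

The key design choice is the lazy `import pty` inside the POSIX branch: a top-level import would raise `ModuleNotFoundError` on Windows before any platform check could run.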

## Test Plan

Changes were tested manually by executing the standard commands from the Llama
Stack guide:
- llama stack build --template ollama --image-type container
- llama stack build --list-templates
- llama stack build

## Sources

Please link relevant resources if necessary.


## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [x] Ran pre-commit to handle lint / formatting issues.
- [x] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.
2025-01-28 07:41:41 -08:00
| Name | Last commit | Date |
|---|---|---|
| apis | Agent response format (#660) | 2025-01-28 05:05:38 -08:00 |
| cli | Ensure llama stack build --config <> --image-type <> works (#879) | 2025-01-25 11:13:36 -08:00 |
| distribution | Add windows support for build execution (#889) | 2025-01-28 07:41:41 -08:00 |
| providers | Agent response format (#660) | 2025-01-28 05:05:38 -08:00 |
| scripts | [memory refactor][3/n] Introduce RAGToolRuntime as a specialized sub-protocol (#832) | 2025-01-22 10:04:16 -08:00 |
| templates | Report generation minor fixes (#884) | 2025-01-28 04:58:12 -08:00 |
| __init__.py | export LibraryClient | 2024-12-13 12:08:00 -08:00 |