RHAIENG-565: purge the midstream repo content so that it hosts only the build artifacts; only the redhat-distribution should remain

Artemy 2025-08-12 12:54:32 +01:00
parent 9803329350
commit db484734b4
1126 changed files with 0 additions and 526647 deletions


@@ -1,77 +0,0 @@
name: 🐛 Bug Report
description: Create a report to help us reproduce and fix the bug
labels: ["bug"]
body:
  - type: markdown
    attributes:
      value: >
        #### Before submitting a bug, please make sure the issue hasn't been already addressed by searching through [the
        existing and past issues](https://github.com/meta-llama/llama-stack/issues).

  - type: textarea
    id: system-info
    attributes:
      label: System Info
      description: |
        Please share your system info with us. You can use the following command to capture your environment information
        python -m "torch.utils.collect_env"
      placeholder: |
        PyTorch version, CUDA version, GPU type, #num of GPUs...
    validations:
      required: true

  - type: checkboxes
    id: information-scripts-examples
    attributes:
      label: Information
      description: 'The problem arises when using:'
      options:
        - label: "The official example scripts"
        - label: "My own modified scripts"

  - type: textarea
    id: bug-description
    attributes:
      label: 🐛 Describe the bug
      description: |
        Please provide a clear and concise description of what the bug is.
        Please also paste or describe the results you observe instead of the expected results.
      placeholder: |
        A clear and concise description of what the bug is.

        ```llama stack
        # Command that you used for running the examples
        ```

        Description of the results
    validations:
      required: true

  - type: textarea
    attributes:
      label: Error logs
      description: |
        If you observe an error, please paste the error message including the **full** traceback of the exception. It may be relevant to wrap error messages in ```` ```triple quotes blocks``` ````.
      placeholder: |
        ```
        The error message you got, with the full traceback.
        ```
    validations:
      required: true

  - type: textarea
    id: expected-behavior
    validations:
      required: true
    attributes:
      label: Expected behavior
      description: "A clear and concise description of what you would expect to happen."

  - type: markdown
    attributes:
      value: >
        Thanks for contributing 🎉!


@@ -1,12 +0,0 @@
blank_issues_enabled: false
contact_links:
  - name: Have you read the docs?
    url: https://llama-stack.readthedocs.io/en/latest/index.html
    about: Much help can be found in the docs
  - name: Start a discussion
    url: https://github.com/meta-llama/llama-stack/discussions/new
    about: Start a discussion on a topic
  - name: Chat on Discord
    url: https://discord.gg/llama-stack
    about: Maybe chatting with the community can help


@@ -1,28 +0,0 @@
name: 🚀 Feature request
description: Request a new llama-stack feature
labels: ["enhancement"]
body:
  - type: textarea
    id: feature-pitch
    attributes:
      label: 🚀 Describe the new functionality needed
      description: >
        A clear and concise description of _what_ needs to be built.
    validations:
      required: true

  - type: textarea
    id: feature-motivation
    attributes:
      label: 💡 Why is this needed? What if we don't build it?
      description: >
        A clear and concise description of _why_ this functionality is needed.
    validations:
      required: true

  - type: textarea
    id: other-thoughts
    attributes:
      label: Other thoughts
      description: >
        Any thoughts about how this may result in complexity in the codebase, or other trade-offs.
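
For reference, all three deleted files follow GitHub's issue forms schema for `.github/ISSUE_TEMPLATE/`: top-level `name`, `description`, and `labels` keys plus a `body` list of typed blocks, where `validations.required` gates submission. A minimal sketch of that structure is below; the field names come from GitHub's documented schema, while the specific name, id, and label values are illustrative only, not taken from the deleted files:

```yaml
# Minimal GitHub issue form; field names follow GitHub's documented
# issue-forms schema. The name/id/label values are illustrative only.
name: Example report
description: A short form GitHub renders in the new-issue chooser
labels: ["triage"]
body:
  - type: markdown
    attributes:
      value: "Static text rendered above the form fields."
  - type: textarea
    id: details            # illustrative id for referencing this field
    attributes:
      label: Details
      description: Free-form text for the reporter to fill in.
    validations:
      required: true       # blocks submission until the field is non-empty
```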