Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-12-17 15:42:37 +00:00
docs(ai): disclose AI assistance

parent 4434fcc2c3
commit b30390dc88

1 changed file with 27 additions and 0 deletions
@@ -2,6 +2,33 @@
We want to make contributing to this project as easy and transparent as
possible.

## AI Assistance Notice

> [!IMPORTANT]
>
> If you are using **any kind of AI assistance** to contribute to Llama Stack,
> it must be disclosed in the pull request.

If you are using any kind of AI assistance while contributing to Llama Stack,
**this must be disclosed in the pull request**, along with the extent to
which AI assistance was used (e.g. docs only vs. code generation).
If PR responses are being generated by an AI, disclose that as well.

An example disclosure:

> This PR was written primarily by Claude Code.

Or a more detailed disclosure:

> I consulted ChatGPT to understand the codebase but the solution
> was fully authored manually by myself.

Failure to disclose this is first and foremost rude to the human operators
on the other end of the pull request, but it also makes it difficult to
determine how much scrutiny to apply to the contribution.

Please be respectful to maintainers and disclose AI assistance.

## Set up your development environment

We use [uv](https://github.com/astral-sh/uv) to manage python dependencies and virtual environments.
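As a rough sketch only, a typical uv-based setup looks like the commands below. The exact commands, extras, and lockfile conventions used by llama-stack may differ, and `pytest` is just an illustrative example of a tool run inside the environment; follow the full setup instructions in the repository's CONTRIBUTING guide.

```bash
# Install uv itself (any supported installation method works)
pip install uv

# Create the project's virtual environment and install the dependencies
# declared in pyproject.toml / uv.lock
uv sync

# Run a command inside that environment without activating it manually
# (pytest here is only an illustrative example)
uv run pytest
```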