(address . bug-mumi@gnu.org)
The mumi CLI must provide features to help reviewers go through a review
checklist. A system that allows even those without commit access to
contribute meaningfully to the review process could be a big win. We can
use this issue to brainstorm ideas. One idea could be as follows.
# Idea 1
Projects provide a review checklist in their .mumi/config. For example,
something like
((review-checklist . (((name . good-commit-message)
                       (description . "Are the commit messages written well?")
                       (tag . review-good-commit-message))
                      ((name . good-synopsis-description)
                       (description . "Are the synopsis and description written well?")
                       (tag . review-good-synopsis-description))
                      ((name . tests-run)
                       (description . "Are the package tests being run (if available)?")
                       (tag . review-tests-run))
                      […]))
 […])
When a reviewer checks one of these items (say, good-commit-message),
they run something like
$ mumi review --tick good-commit-message
and that sets the review-good-commit-message tag on the issue.
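Internally, --tick could be little more than a lookup in the
review-checklist alist followed by setting the corresponding tag. A rough
Guile sketch, with the checklist hard-coded instead of read from
.mumi/config, and with set-issue-tag! standing in as a made-up helper
that adds a usertag to the issue:

(use-modules (srfi srfi-1))   ;for `find'

(define checklist
  ;; The value of `review-checklist' from .mumi/config.
  '(((name . good-commit-message)
     (description . "Are the commit messages written well?")
     (tag . review-good-commit-message))
    ((name . tests-run)
     (description . "Are the package tests being run (if available)?")
     (tag . review-tests-run))))

(define (checklist-entry name)
  ;; Return the checklist entry called NAME, or #f if there is none.
  (find (lambda (entry)
          (eq? name (assq-ref entry 'name)))
        checklist))

(define (tick! issue name)
  ;; `mumi review --tick NAME' on issue ISSUE.
  (let ((entry (checklist-entry name)))
    (unless entry
      (error "unknown checklist item" name))
    ;; set-issue-tag! is hypothetical; it would add the usertag.
    (set-issue-tag! issue (assq-ref entry 'tag))))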
We could also have a status command like
$ mumi review --status
that lists the complete checklist with a tick mark by items that have
been checked.
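--status then only needs the tags already present on the issue.
Continuing the sketch above, with issue-tags as another made-up helper
returning the issue's tags as a list of symbols:

(use-modules (ice-9 format))

(define (review-status issue)
  ;; `mumi review --status' on issue ISSUE: print every checklist item,
  ;; with a tick mark for those whose tag is already set on the issue.
  (let ((tags (issue-tags issue)))
    (for-each (lambda (entry)
                (format #t "[~a] ~a~%"
                        (if (memq (assq-ref entry 'tag) tags) "x" " ")
                        (assq-ref entry 'description)))
              checklist)))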
This system is really just a convenience wrapper around tagging, so
issues can be searched with something like
$ mumi search tag:review-good-commit-message
One might argue, however, that such searches are not very useful.
One possible downside is that this ties each project (guix, mumi, etc.)
to a single checklist. What about guix patches that are not for
packages, for example? It might be worth allowing multiple checklists
per project.
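If we go that way, .mumi/config could map checklist names to
checklists. Purely as an illustration (the names packages and services
are made up):

((review-checklists . ((packages . (((name . good-commit-message)
                                     […])
                                    […]))
                       (services . (((name . good-commit-message)
                                     […])
                                    […]))))
 […])

and the reviewer would pick one with something like `mumi review
--checklist services --tick good-commit-message'.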
Another downside is that this does not let multiple reviewers verify
each other's findings. In other words, there is no way for two
reviewers to record that they each verified the same item
independently.
# Idea 2
A second, much simpler idea is to implement templates for `mumi
compose'. Projects provide templates under .mumi/templates. For example,
$ cat .mumi/templates/review
[ ] Are the commit messages written well?
[ ] Are the synopsis and description written well?
[ ] Are the package tests being run (if available)?
Then, to review something, the reviewer composes an email like so
$ mumi compose --template review
that composes an email with this template. The reviewer puts an 'x' by
items they have checked.
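For example, a reviewer who has verified the first two items but not
run the tests would send:

[x] Are the commit messages written well?
[x] Are the synopsis and description written well?
[ ] Are the package tests being run (if available)?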
The downside of this method is that the result is just unstructured
text. There is no way for mumi to understand which parts of the review
checklist have been completed, and thus no way to generate useful
reports, filtering, and so on.
Thank you for listening to my brain dump! Suggestions welcome.
Cheers! :-)