QA/QC Data/Model issue templates with tasks to up the professionalism and utility of Riverscape Model and Data products #322
-
@philipbaileynar, about a year ago when we piloted the idea of the webstore, you and I discussed the need for a consistent workflow and, ideally, a checklist of all the tasks that need to happen for a model run to actually be complete. Right now the request is vaguely to "produce a run of BRAT". While that may have sufficed until now, it leaves it up to judgement what, if anything, else gets communicated, and the manual, critical QA/QC of those projects typically never gets done. That is not good enough; we have simply gotten away with it.

In the time since we shelved the idea of formalizing the QA/QC process (I think we were imagining all sorts of horrible BaseCamp and/or custom database solutions) and didn't completely follow through on that aspiration, GitHub has added tasks within issues (including tasks that can be split out into new issues), and issue templates give us a way to realize this. I think we should make a template (or two) here on the riverscapes-tools repo to provide a more consistent and thorough process. Please see this discussion on riverscapes-xml, where I turn all the instructions we gave the curators in the workshop last week into repeatable, actionable workflows through GitHub, and see what you think.

Let's discuss the merits here and agree on what the template tasks should be, so it strikes the right balance of what we need without being too cumbersome. Ultimately, a link to the issue could be a useful part of the metadata of projects requested and run this way (i.e. a manual request), so that it documents the QA/QC and curation of that project and gives credit and details of what was done.

Sorry to be annoying, but I think this is an easy incremental step for upping our game and making those "store" model purchases meet a standard beyond "well, the automated model ran without throwing any errors" (I know that's the part @MattReimer cares about). I want a real set of eyes on each run so that it is actually producing something that makes sense (right and wrong answers for the right reasons) and is of practical value (i.e. worth charging people for).
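To make this concrete, here is a rough sketch of what such a template might look like, using GitHub's task-list checkboxes. The file location, field names and checklist items are placeholders to illustrate the idea, not a finished QA/QC standard:

```markdown
<!-- Hypothetical example: .github/ISSUE_TEMPLATE/model_run_qaqc.md -->
---
name: Model run QA/QC
about: Track a manually requested model run through QA/QC and curation
labels: qa-qc
---

**Tool & version:** (e.g. BRAT x.y.z)
**Watershed / HUC:**
**Requested by:**

### QA/QC checklist
- [ ] Run completed without errors; logs reviewed
- [ ] Inputs spot-checked against source data
- [ ] Outputs visually inspected (right answers for the right reasons)
- [ ] Project metadata complete (including a link back to this issue)
- [ ] Project uploaded to the warehouse and verified
- [ ] Sign-off by reviewer
```

Each checkbox could be promoted to its own issue where the work warrants it, and the issue URL itself becomes the QA/QC breadcrumb we drop into the project metadata.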
-
Hmmm. I definitely want to keep all the QA/QC churn out of GitHub if possible, unless it's about bugs and fixing the tool. If you want to link to anything on GitHub, may I suggest the release notes for the version of the tool that was run?

Are we talking about having a sign-off process for every project in the system? Don't get me wrong, I love the idea, but it sounds like a massive undertaking.

Ideally the QA/QC process would involve downloading the project, running it through some sort of checklist, then uploading the changes (if any) back, along with a project file that has been augmented with some sort of QA/QC status flag, a signature (or just a user id) of whoever did the work, and any comments related to the work itself. We would probably need an improvement to how we download and upload projects. Our NPM CLI is no slouch, but I think it needs a few more guard rails, or to be rewritten in some other technology entirely, before unleashing it on any non-devs. All of it is possible, and the XML was definitely built for this kind of thing. We just need to figure out the scope, the process, the rules and the timing.
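As a thought experiment only, here is a minimal sketch of what "augmenting" the project file might look like, assuming a downloaded project file named `project.rs.xml` and a made-up `<QAQC>` element; none of these tag or attribute names are part of the existing riverscapes project schema:

```python
# Hypothetical sketch: append a QA/QC record to a downloaded project file.
# The file name and every element/attribute name below are assumptions for
# illustration, not part of the current riverscapes XML schema.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def add_qaqc_record(project_xml_path: str, status: str, user_id: str, comments: str) -> None:
    tree = ET.parse(project_xml_path)
    project = tree.getroot()

    # Create (or reuse) a container node that holds the QA/QC history.
    qaqc = project.find('QAQC')
    if qaqc is None:
        qaqc = ET.SubElement(project, 'QAQC')

    # One <Review> per sign-off: status flag, signature (user id) and timestamp.
    review = ET.SubElement(qaqc, 'Review', attrib={
        'status': status,                                   # e.g. "PASS", "FAIL", "NEEDS_WORK"
        'userId': user_id,                                   # whoever did the work
        'dateUTC': datetime.now(timezone.utc).isoformat()    # when the review happened
    })
    ET.SubElement(review, 'Comments').text = comments

    tree.write(project_xml_path, encoding='utf-8', xml_declaration=True)

# Example usage after working through the checklist on a downloaded project:
# add_qaqc_record('project.rs.xml', 'PASS', 'jdoe',
#                 'Spot-checked outputs against imagery; results look sensible.')
```

The point is only that the status flag, signature and comments described above can live as ordinary elements in the project file and travel with it when it is re-uploaded.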