Add more nf-tests #252
Conversation
Warning: A newer version of the nf-core template is available. Your pipeline is using an old version of the nf-core template: 3.3.1. For more documentation on how to update your pipeline, please see the nf-core documentation and the Synchronisation documentation.
Hm, I don't get why tests are cancelled without apparent reason. That's new to me (well, this is only the 3rd pipeline where I am adding tests...). Automatically triggered run: shards 5/9 & 9/9 cancelled (but succeeded with
Great job!!! All tests passed. Let me know when this PR is ready for review.
Thanks. I'll trigger the tests once or twice more to confirm that the files and file contents are stable, but I think it's ready for review.
No failed tests across several triggers, so I think it's good to go.
Daniel-VM left a comment:
Awesome. nf-tests integrated perfectly (no tests failed). Well done.
(We will continue improving the stability of the output files generated by Dragonflye and other modules in future versions.)
I integrated all test profiles into nf-test tests.
My approach was the following (a sketch of how the pieces fit together follows this list):

(1) run all tests with `nf-test test . --profile=+singularity --update-snapshot --clean-snapshot --verbose`
(2) check the snapshots and add entries that deviated to one of:
(2a) `tests/.nftignore` (the majority of files, when I assumed or observed that md5sums varied across several tests)
(2b) `tests/.nftignore_files_entirely` (files that occurred only sometimes or had variable names, e.g. names containing date/time, or database files)
(2c) the `ignore:` param in `tests/*.nf.test` (~line 20) (when I assumed or observed files with variable md5sums in only one or a few tests)
(3) repeat (1) and (2) until nothing changes any more
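For context, here is a minimal sketch of how (2a)-(2c) typically hook into a pipeline-level nf-test file. It assumes the `getAllFilesFromDir` helper from the nft-utils plugin; the ignore-file names are the ones used in this PR, while the `pipeline_info` glob is only a made-up example:

```groovy
nextflow_pipeline {

    name   "Test -profile test"
    script "../main.nf"

    test("-profile test") {

        when {
            params {
                outdir = "$outputDir"
            }
        }

        then {
            // (2b) + (2c): entries with unstable *names* are dropped from the name snapshot,
            // either via the shared ignore file or via the per-test 'ignore:' list (~line 20)
            def stable_name = getAllFilesFromDir(params.outdir,
                relative: true, includeDir: true,
                ignoreFile: 'tests/.nftignore_files_entirely',
                ignore: ['pipeline_info/*.{html,json,txt}'])   // hypothetical per-test glob
            // (2a): files with unstable *content* (md5sums) are excluded from the content snapshot
            def stable_path = getAllFilesFromDir(params.outdir,
                ignoreFile: 'tests/.nftignore')
            assertAll(
                { assert workflow.success },
                { assert snapshot(stable_name, stable_path).match() }
            )
        }
    }
}
```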
Notable entries in (2b) are: BUSCO downloads and database hmm & fasta files; ToulligQC output folders, whose names contain date & time; and the NanoPlot log file.
I never checked why those files have unstable md5sums; potentially it's worth checking in some cases.
I might have been a little overzealous in not checking file md5sums as in (2a); possibly a few more fall into (2c). If there are specific key files that do not vary (check beforehand!), those could be accommodated.
I think (I experienced it, but didn't look more closely, i.e. I might be wrong) that the dragonflye assembly is unstable (and therefore all downstream output), as well as the Medaka output, but the miniasm & Unicycler assemblies are stable. Potentially it's worth looking into why and at what point those differ; maybe it's possible to change a param or seed to make those results stable.
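If a seed does turn out to help, the usual nf-core place to pin it would be `ext.args` in `conf/modules.config`. This is only a hypothetical sketch: whether Dragonflye (or Medaka) actually exposes a usable seed or deterministic-mode flag, and what the process selector is called in this pipeline, both need to be checked first.

```groovy
// conf/modules.config -- hypothetical snippet: '--seed 42' is an assumed flag and
// 'DRAGONFLYE' an assumed process name; verify against the tool's help and the
// pipeline's module names before relying on this for snapshot stability
process {
    withName: 'DRAGONFLYE' {
        ext.args = '--seed 42'
    }
}
```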
My plan is to run the tests here at least two more times to see if there is any more deviation.
PR checklist
- Make sure your code lints (`nf-core pipelines lint`).
- Ensure the test suite passes (`nextflow run . -profile test,docker --outdir <OUTDIR>`).
- Check for unexpected warnings in debug mode (`nextflow run . -profile debug,test,docker --outdir <OUTDIR>`).
- `docs/usage.md` is updated.
- `docs/output.md` is updated.
- `CHANGELOG.md` is updated.
- `README.md` is updated (including new tool citations and authors/contributors).