
Conversation

@d4straub
Collaborator

@d4straub d4straub commented Jul 8, 2025

I integrated all test profiles into nf-test tests.

My approach was the following:
(1) run all tests with `nf-test test . --profile=+singularity --update-snapshot --clean-snapshot --verbose`
(2) check snapshots; add entries that deviated to one of
(2a) `tests/.nftignore` (the majority of files, when I assumed or observed that md5sums varied across several tests)
(2b) `tests/.nftignore_files_entirely` (files that occurred only sometimes or had variable names, e.g. names containing date/time, or database files)
(2c) the `ignore:` param in `tests/*.nf.test` (~line 20) (when I assumed or observed files with variable md5sums in only one or a few tests)
(3) repeat (1) and (2) until there are no further changes
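As a rough illustration of mechanism (2c), here is a minimal sketch of how an `ignore:` list and the shared ignore files could be wired up in a `tests/*.nf.test` `then` block using the nf-core `nft-utils` helpers. The glob patterns below are purely illustrative placeholders, not the actual entries from this PR:

```groovy
// Sketch of a tests/*.nf.test 'then' block (nft-utils helpers; patterns illustrative)
then {
    // Skip per-test unstable files (2c) via 'ignore:' and files that should be
    // dropped entirely (2b) via 'ignoreFile:' before snapshotting file names
    def stable_name = getAllFilesFromDir(
        params.outdir,
        relative: true,
        includeDir: true,
        ignore: ['pipeline_info/execution_trace_*.txt'],   // (2c) per-test ignores
        ignoreFile: 'tests/.nftignore_files_entirely'      // (2b) shared ignore file
    )
    // Files whose names are stable but whose md5sums vary (2a) are filtered
    // with tests/.nftignore before snapshotting file content
    def stable_path = getAllFilesFromDir(
        params.outdir,
        relative: true,
        ignoreFile: 'tests/.nftignore'
    )
    assert snapshot(stable_name, stable_path).match()
}
```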

Notable entries in (2b) are: BUSCO downloads and database hmm & fasta files; ToulligQC output folders, which contain date & time; and the NanoPlot log file.
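For reference, these ignore files are plain lists of glob patterns, one per line. For cases like the above, entries might look like this (patterns are illustrative, not copied from the PR):

```
# tests/.nftignore_files_entirely — skip these files completely (illustrative)
busco/**/busco_downloads/**
busco/**/*.{hmm,fasta}
toulligqc/**/report.html
nanoplot/**/NanoPlot_*.log
```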

I never checked why those files have unstable md5sums; it is potentially worth checking in some cases.
I might have been a little overzealous with not checking file md5sums as in (2a), and possibly a few more fall into (2c). If there are some specific key files that do not vary (check beforehand!), those could be accommodated.

I think (I experienced it, but didn't look more closely, i.e. I might be wrong) that the dragonflye assembly is unstable (and therefore all downstream output), as is Medaka output, while the miniasm & Unicycler assemblies are stable. It is potentially worth looking into why and at what point those differ; maybe it's possible to change a param or seed to make those results stable.

My plan is to run the tests here at least two more times to see whether there is any further deviation.

PR checklist

  • This comment contains a description of changes (with reason).
  • If you've fixed a bug or added code that should be tested, add tests!
  • If you've added a new tool — have you followed the pipeline conventions in the contribution docs?
  • If necessary, also make a PR on the nf-core/bacass branch on the nf-core/test-datasets repository.
  • Make sure your code lints (nf-core pipelines lint).
  • Ensure the test suite passes (nextflow run . -profile test,docker --outdir <OUTDIR>).
  • Check for unexpected warnings in debug mode (nextflow run . -profile debug,test,docker --outdir <OUTDIR>).
  • Usage Documentation in docs/usage.md is updated.
  • Output Documentation in docs/output.md is updated.
  • CHANGELOG.md is updated.
  • README.md is updated (including new tool citations and authors/contributors).

@nf-core-bot
Member

Warning

Newer version of the nf-core template is available.

Your pipeline is using an old version of the nf-core template: 3.3.1.
Please update your pipeline to the latest version.

For more documentation on how to update your pipeline, please see the nf-core documentation and Synchronisation documentation.

@d4straub
Collaborator Author

d4straub commented Jul 8, 2025

Hm, I don't get why tests are cancelled without apparent reason. That's new to me (well, this is only the 3rd pipeline where I am adding tests...)

Automatically triggered run - shards 5/9 & 9/9 cancelled (but succeeded with latest-everything)
Manually re-triggered shards 5/9 & 9/9 - passed
Manually re-trigger all checks (1) - passed
Manually re-trigger all checks (2) - passed
Manually re-trigger all checks (3) - passed

@Daniel-VM
Contributor

Great job! All tests passed. Let me know when this PR is ready for review.

@d4straub
Collaborator Author

d4straub commented Jul 9, 2025

Thanks. I'll trigger the tests once or twice more to verify stable files and file content, but I think it's ready for review.

@d4straub d4straub marked this pull request as ready for review July 9, 2025 07:04
@d4straub
Collaborator Author

d4straub commented Jul 9, 2025

No failed tests across several triggers, so I think it's good to go.

Contributor

@Daniel-VM Daniel-VM left a comment


Awesome. nf-tests integrated perfectly (no tests failed). Well done.

(We will continue improving the stability of the output files generated by Dragonflye and other modules in future versions.)

@d4straub d4straub merged commit f3d3813 into nf-core:dev Jul 9, 2025
102 of 124 checks passed

3 participants