basicpy nf-test #7772
base: master
Conversation
```groovy
then {
    assertAll(
        { assert process.success },
        { assert snapshot(process.out).match() }
```
Suggested change:

```diff
-{ assert snapshot(process.out).match() }
+{ assert snapshot(process.out.versions,
+                  file(process.out.fields.get(0).get(1)).list().sort()).match() }
```
Maybe we can try this to see if the order of files changes?
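(For context, the suggestion slotted into the full `then` block would look roughly like this; a sketch, only the snapshot assertion changes from the excerpt above.)

```groovy
then {
    assertAll(
        { assert process.success },
        // snapshot the versions channel plus the sorted file listing
        // instead of the full process output
        { assert snapshot(
              process.out.versions,
              file(process.out.fields.get(0).get(1)).list().sort()
          ).match() }
    )
}
```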
I was able to check the order with `process.out.fields.flatten().sort()`, but the GitHub Actions tests have the correct md5sum for the dfp outputs; it's just that ffp was consistently different.
Then I'd suggest swapping to multiple output channels. That's cleaner anyway: one for ffp, one for dfp, and mark both optional.
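A sketch of what that could look like in the module's `output:` block. The `optional: true` flags follow the "mark both optional" suggestion; the `versions.yml` line is an assumption based on common nf-core module conventions, not the PR's actual diff:

```nextflow
output:
    tuple val(meta), path("*-dfp.tiff"), emit: dfp, optional: true
    tuple val(meta), path("*-ffp.tiff"), emit: ffp, optional: true
    path "versions.yml"                , emit: versions
```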
Thanks for the tip, I've updated the output specification to:

```nextflow
output:
    tuple val(meta), path("*-dfp.tiff"), path("*-ffp.tiff"), emit: profiles
```
The md5sum is still exactly the same as before, and I don't see how this will affect the difference between the files produced by the GitHub runners and in the Gitpod environment. Is there any way to access the output files from the Action runners so I could run a diff to see the difference?
You will need to do two output channels, so:

```nextflow
output:
    tuple val(meta), path("*-dfp.tiff"), emit: dfp
    tuple val(meta), path("*-ffp.tiff"), emit: ffp
```
I'll check in later again though. I have some ideas.
As with some other modules, I could just check for the presence of output files, but I think it's essential to understand where the difference in the "cycif-tonsil-cycle1.ome-ffp.tiff" md5sums between GitHub runners and other environments comes from, especially since both Docker and Singularity produce reproducible results when run on the same platform.
Can you somehow print the tiff file? It's an image format, right?
Hey I think we should try this plugin: https://github.com/nf-core/nft-tiff
I think it would be easier to understand the source of the difference if I could access or reproduce the outputs from the runners here. Would that be possible somehow?
Maybe with Codespaces?
Thanks for highlighting Codespaces! I actually got a third md5sum value there (again reproducible across Docker and Singularity), but these files I could now access. The difference seems to come from floating-point precision issues.
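If the root cause really is floating-point rounding, an exact md5sum can never be stable across platforms; the usual workaround is a tolerance-based comparison of the pixel values. A minimal Groovy sketch of that idea (a hypothetical helper, not part of nf-test or this PR; reading the TIFF pixels into arrays would still need an imaging library):

```groovy
// Compare two float arrays element-wise with an absolute tolerance,
// so platform-dependent rounding in the last bits does not fail the test.
boolean approxEqual(float[] a, float[] b, double tol = 1e-6) {
    if (a.length != b.length) return false
    for (int i = 0; i < a.length; i++) {
        if (Math.abs(a[i] - b[i]) > tol) return false
    }
    return true
}
```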
I've updated the test to check for the existence and size of the output files. Since this is a floating-point error, is this appropriate? Additionally, testing in the mcmicro workflow adds a Groovy script (https://github.com/nf-core/mcmicro/blob/dev/tests/lib/ImageUtils.groovy) to extract the metadata, which then gets validated. Could something similar be added to module testing?
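A sketch of what the existence-and-size assertions could look like, assuming the two named output channels from above; the indexing and the `> 0` threshold are illustrative, not the exact test committed in the PR:

```groovy
then {
    assertAll(
        { assert process.success },
        // dfp tuple is [meta, path]: check the file exists and is non-empty
        { assert file(process.out.dfp.get(0).get(1)).exists() },
        { assert file(process.out.dfp.get(0).get(1)).size() > 0 },
        // ffp md5 differs across runners, so only existence and size are checked
        { assert file(process.out.ffp.get(0).get(1)).exists() },
        { assert file(process.out.ffp.get(0).get(1)).size() > 0 },
        // versions.yml is still stable and can be snapshotted
        { assert snapshot(process.out.versions).match() }
    )
}
```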
Attempt 3:
New PR opened due to conflicts.
In Gitpod, tests pass when running
`nf-core modules test BASICPY --profile docker` and
`nf-core modules test BASICPY --profile singularity`.
PR checklist
- Closes #7539
- `versions.yml`
- `file.label`
- `nf-core modules test <MODULE> --profile docker`
- `nf-core modules test <MODULE> --profile singularity`
- `nf-core modules test <MODULE> --profile conda`
- `nf-core subworkflows test <SUBWORKFLOW> --profile docker`
- `nf-core subworkflows test <SUBWORKFLOW> --profile singularity`
- `nf-core subworkflows test <SUBWORKFLOW> --profile conda`