
basicpy nf-test #7772

Open · kbestak wants to merge 8 commits into master
Conversation

@kbestak (Contributor) commented Mar 11, 2025

Attempt 3: new PR opened due to conflicts.

In Gitpod, tests pass when running nf-core modules test BASICPY --profile docker and nf-core modules test BASICPY --profile singularity.

PR checklist

Closes #7539

  • This comment contains a description of changes (with reason).
  • If you've fixed a bug or added code that should be tested, add tests!
  • If you've added a new tool, have you followed the module conventions in the contribution docs?
  • If necessary, include test data in your PR.
  • Remove all TODO statements.
  • Emit the versions.yml file.
  • Follow the naming conventions.
  • Follow the parameters requirements.
  • Follow the input/output options guidelines.
  • Add a resource label
  • Use BioConda and BioContainers if possible to fulfil software requirements.
  • Ensure that the test works with either Docker / Singularity. Conda CI tests can be quite flaky:
    • For modules:
      • nf-core modules test <MODULE> --profile docker
      • nf-core modules test <MODULE> --profile singularity
      • nf-core modules test <MODULE> --profile conda
    • For subworkflows:
      • nf-core subworkflows test <SUBWORKFLOW> --profile docker
      • nf-core subworkflows test <SUBWORKFLOW> --profile singularity
      • nf-core subworkflows test <SUBWORKFLOW> --profile conda
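
For orientation, a minimal sketch of what a migrated nf-test file for this module might look like (file layout and test input are illustrative assumptions, not the exact test in this PR):

    nextflow_process {

        name "Test Process BASICPY"
        script "../main.nf"
        process "BASICPY"

        test("cycif tonsil cycle1") {

            when {
                process {
                    """
                    // hypothetical input; the real test uses nf-core test data
                    input[0] = [
                        [ id:'test' ],
                        file('cycif-tonsil-cycle1.ome.tiff')
                    ]
                    """
                }
            }

            then {
                assertAll(
                    { assert process.success },
                    { assert snapshot(process.out).match() }
                )
            }
        }
    }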

@kbestak requested a review from a team as a code owner on Mar 11, 2025 09:49
@kbestak mentioned this pull request on Mar 11, 2025
@kbestak linked an issue on Mar 11, 2025 that may be closed by this pull request
Review thread on the test's then block:

    then {
        assertAll(
            { assert process.success },
            { assert snapshot(process.out).match() }
Contributor:
Suggested change:

    - { assert snapshot(process.out).match() }
    + { assert snapshot(process.out.versions,
    +       file(process.out.fields.get(0).get(1)).list().sort()).match() }

Maybe we can try this to see if the order of files changes?

@kbestak (Author) Mar 11, 2025:
I was able to check the order with process.out.fields.flatten().sort(), but the GitHub Actions tests have the correct md5sum for the dfp outputs; it's just that ffp was consistently different.
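
(For reference, that order check can sit alongside the existing assertions; a sketch, not the committed test:)

    then {
        assertAll(
            { assert process.success },
            // sorted flat listing makes the snapshot independent of emission order
            { assert snapshot(process.out.fields.flatten().sort()).match() }
        )
    }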

Contributor:
Then I'd suggest swapping to multiple output channels. That's cleaner anyway: one for ffp, one for dfp, and mark both optional.

@kbestak (Author):
Thanks for the tip, I've updated the output specification to:

    output:
    tuple val(meta), path("*-dfp.tiff"), path("*-ffp.tiff"), emit: profiles

The md5sum is still exactly the same as before, so I don't see how this will affect the difference between the files produced by the GitHub runners and in the Gitpod environment. Is there potentially any way to access the output files from the action runners so I could run diff to see the difference?

Contributor:
You will need to define two output channels, like so:

    output:
    tuple val(meta), path("*-dfp.tiff"), emit: dfp
    tuple val(meta), path("*-ffp.tiff"), emit: ffp

Contributor:
I'll check in again later though. I have some ideas.

@kbestak (Author) commented Mar 11, 2025
As with some other modules, I could just check for the presence of output files, but I think it's essential to understand where the difference in the "cycif-tonsil-cycle1.ome-ffp.tiff" md5sums between GitHub runners and other environments comes from, especially since both Docker and Singularity produce reproducible results when run on the same platform.

@famosab (Contributor) commented Mar 11, 2025
Can you somehow print the tiff file? It's an image format, right?
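
(One quick way to inspect it, assuming a JDK 9+ whose ImageIO has built-in TIFF support; a diagnostic sketch, not part of the module:)

    // Groovy: print basic properties and a few raster samples of the TIFF,
    // to compare across environments without relying on md5sums
    import javax.imageio.ImageIO

    def img = ImageIO.read(new File('cycif-tonsil-cycle1.ome-ffp.tiff'))
    println "dimensions: ${img.width} x ${img.height}"
    def raster = img.raster
    println "pixel(0,0): ${raster.getSampleFloat(0, 0, 0)}"
    println "pixel(1,0): ${raster.getSampleFloat(1, 0, 0)}"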

@famosab (Contributor) commented Mar 11, 2025

Hey, I think we should try this plugin: https://github.com/nf-core/nft-tiff
(it still seems to be under development though :/)

@kbestak (Author) commented Mar 11, 2025

I think it would be easier to understand the source of the difference if I could access or reproduce the runners' outputs here; would that be possible somehow?

@famosab (Contributor) commented Mar 11, 2025

Maybe with Codespaces?

@kbestak (Author) commented Mar 11, 2025

Thanks for highlighting Codespaces! I actually got a third md5sum value there (again reproducible across Docker and Singularity), but this time I could access the files.

The difference seems to come from floating point precision issues.
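
(Which would explain the md5sum behaviour: a one-ulp difference in a single float flips the checksum of the whole file. An illustrative Groovy snippet, not from this PR:)

    import java.security.MessageDigest

    // two floats one ulp apart, as different CPU/library builds may produce
    float a = 0.1f
    float b = Math.nextUp(a)

    def md5 = { byte[] bytes ->
        MessageDigest.getInstance('MD5').digest(bytes).encodeHex().toString()
    }
    def toBytes = { float f -> java.nio.ByteBuffer.allocate(4).putFloat(f).array() }

    println md5(toBytes(a))  // differs from the next line's digest
    println md5(toBytes(b))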

@kbestak (Author) commented Mar 11, 2025

I've updated the test to check for the existence and size of the output files; since this is a floating point error, is this appropriate?

Additionally, testing in the mcmicro workflow adds a Groovy script (https://github.com/nf-core/mcmicro/blob/dev/tests/lib/ImageUtils.groovy) to extract the metadata, which then gets validated. Could something similar be added to module testing?
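
(A sketch of what that existence-and-size check might look like, assuming the two-channel dfp/ffp output suggested above; names are illustrative:)

    then {
        assertAll(
            { assert process.success },
            { assert file(process.out.dfp.get(0).get(1)).exists() },
            { assert file(process.out.dfp.get(0).get(1)).size() > 0 },
            { assert file(process.out.ffp.get(0).get(1)).exists() },
            { assert file(process.out.ffp.get(0).get(1)).size() > 0 },
            // versions.yml content is deterministic, so it can stay snapshotted
            { assert snapshot(process.out.versions).match() }
        )
    }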

Labels: none yet
Project status: Needs help
Linked issue (may be closed by this PR): nf-test migration: basicpy
2 participants