Feature/add unit test scaffolding #61
Conversation
Samplesheet validation and AIRR conversion tests pass before the integration test runs.
Unit Test Results: 10 tests, 10 ✅, 2m 50s ⏱️. Results for commit f514062. ♻️ This comment has been updated with latest results.
minimal-example pipeline run: artifact too large.
Pull Request Overview
This PR introduces comprehensive test infrastructure using nf-test for the Nextflow pipeline. The changes add unit tests for core modules (SAMPLESHEET_CHECK and CONVERT_ADAPTIVE), create test fixtures with sample data, and enhance the CI/CD workflow to run automated tests on pull requests.
Key changes:
- Adds nf-test unit tests for samplesheet validation and Adaptive format conversion modules
- Creates test fixtures including valid, empty, malformed, and missing-column samplesheets, plus Adaptive TCR data files
- Restructures GitHub Actions workflow to run separate unit tests and pipeline integration tests with JUnit XML reporting
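As a rough sketch, a module-level nf-test spec for the samplesheet check might look like the following. The process name and module path come from this PR's file list; the fixture path and the specific assertions are illustrative, not the PR's actual test body:

```groovy
nextflow_process {

    name "Test SAMPLESHEET_CHECK"
    script "modules/local/samplesheet/samplesheet_check.nf"
    process "SAMPLESHEET_CHECK"

    test("valid samplesheet passes validation") {
        when {
            process {
                // Fixture path taken from this PR's tests/fixtures directory
                """
                input[0] = file("tests/fixtures/valid_samplesheet.csv")
                """
            }
        }
        then {
            // Illustrative assertions: the process succeeds and emits output
            assert process.success
            assert process.out != null
        }
    }
}
```

Specs like this are what `nf-test test` discovers and runs in the unit-tests CI job.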
Reviewed Changes
Copilot reviewed 12 out of 13 changed files in this pull request and generated 3 comments.
Summary per file:
| File | Description |
|---|---|
| tests/modules/local/samplesheet/samplesheet_check.nf.test | Unit tests for samplesheet validation covering valid, empty, malformed, and minimal-example inputs |
| tests/modules/local/airr_convert/convert_adaptive.nf.test | Unit tests for Adaptive format conversion including metadata preservation, gene call validation, and error handling |
| tests/fixtures/valid_samplesheet.csv | Test fixture with valid 4-sample samplesheet data |
| tests/fixtures/empty_samplesheet.csv | Test fixture with header-only samplesheet for edge case testing |
| tests/fixtures/malformed_samplesheet.csv | Test fixture with malformed CSV data for error handling validation |
| tests/fixtures/missing_columns_samplesheet.csv | Test fixture missing required columns for validation testing |
| tests/fixtures/data/PD1_Patient*.tsv | Test fixtures with Adaptive TCR format data subsets for conversion testing |
| tests/fixtures/data/malformed_adaptive.tsv | Malformed Adaptive data for error handling tests |
| .gitignore | Adds nf-test artifacts and output directories to ignore list |
| .github/workflows/nextflow-test.yaml | Refactored CI workflow with separate unit-tests and pipeline-test jobs, adds test result publishing and artifact uploading |
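The missing-columns fixture exercises the kind of required-column check sketched below. This is a minimal stand-in, not the pipeline's actual validation script, and the column names are hypothetical (the real schema is whatever SAMPLESHEET_CHECK enforces):

```python
import csv
import io

# Hypothetical required columns; the real schema is defined by SAMPLESHEET_CHECK
REQUIRED_COLUMNS = {"sample", "file"}

def missing_columns(samplesheet_text: str) -> set:
    """Return the required columns absent from the samplesheet header row."""
    reader = csv.reader(io.StringIO(samplesheet_text))
    header = next(reader, [])
    return REQUIRED_COLUMNS - set(header)

# A samplesheet missing the 'file' column fails the check
assert missing_columns("sample,condition\nPD1_Patient1,treated\n") == {"file"}
# A samplesheet with all required columns passes
assert missing_columns("sample,file\nPD1_Patient1,data.tsv\n") == set()
```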
```diff
-      - name: run pipeline
-        run: nf-test test
+      - name: Run pipeline test
+        run: nf-test test tests/main.nf.test --verbose --junitxml=pipeline-test-results.xml
```
Copilot AI · Nov 6, 2025
The pipeline-test job generates pipeline-test-results.xml but doesn't have steps to publish or upload these results, unlike the unit-tests job (lines 39-53). Consider adding similar steps to publish and upload the pipeline test results for consistency and better CI/CD visibility.
Suggested change:

```yaml
        run: nf-test test tests/main.nf.test --verbose --junitxml=pipeline-test-results.xml
      - name: Publish pipeline test results
        if: always()
        uses: EnricoMi/publish-unit-test-result-action@v2
        with:
          files: pipeline-test-results.xml
          check_name: Pipeline Test Results
      - name: Upload pipeline test results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: pipeline-test-results
          path: |
            .nf-test/tests/
            pipeline-test-results.xml
```
We would normally keep the `--junitxml` artifact from integration-like tests, so in the ideal scenario we should keep this option. But the integration tests here use a relatively large dataset, so the resulting artifact is quite large (gigabytes in size, IIRC).
The correct solution is to use a dataset that is a bit more truncated than the one in the minimal-example.
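One lightweight way to build such a truncated fixture is to keep just the header plus the first N data rows of the full TSV. The file names, column names, and row counts below are placeholders; the snippet generates dummy data only to make the truncation step concrete:

```shell
# Simulate a large Adaptive-style TSV export (placeholder columns and data).
printf 'rearrangement\tamino_acid\n' > full_adaptive_export.tsv
for i in $(seq 1 1000); do
  printf 'SEQ%s\tCASS\n' "$i" >> full_adaptive_export.tsv
done

# Truncate to a small fixture: header line plus the first 100 data rows.
head -n 101 full_adaptive_export.tsv > truncated_fixture.tsv
wc -l < truncated_fixture.tsv
```

A fixture built this way keeps the CI artifact small while still exercising the conversion path end to end.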
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
dimalvovs left a comment
Nice work on adding module tests!
There is one Copilot comment that looks legit; should we accept the proposed changes?
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Adds unit test scaffolding for the SAMPLESHEET_CHECK and CONVERT_ADAPTIVE nextflow processes. Once the scaffolding is set up we can write some unit tests for #60.