Conversation

@pvillard31
Contributor

Summary

NIFI-15408 - Explore parallel builds to reintroduce integration tests in code coverage

Tracking

Please complete the following tracking steps prior to pull request creation.

Issue Tracking

Pull Request Tracking

  • Pull Request title starts with Apache NiFi Jira issue number, such as NIFI-00000
  • Pull Request commit message starts with Apache NiFi Jira issue number, such as NIFI-00000
  • Pull request contains commits signed with a registered key indicating Verified status

Pull Request Formatting

  • Pull Request based on current revision of the main branch
  • Pull Request refers to a feature branch with one commit containing changes

Verification

Please indicate the verification steps performed prior to pull request creation.

Build

  • Build completed using ./mvnw clean install -P contrib-check
    • JDK 21
    • JDK 25

Licensing

  • New dependencies are compatible with the Apache License 2.0 according to the License Policy
  • New dependencies are documented in applicable LICENSE and NOTICE files

Documentation

  • Documentation formatting appears as expected in rendered files

@pvillard31 pvillard31 marked this pull request as ready for review December 30, 2025 22:53
Contributor

@exceptionfactory exceptionfactory left a comment

Thanks for working on this @pvillard31.

As far as pulling the images before running tests, does that provide an actual gain over pulling them when needed? It seems like the images would be cached as a result of pulling when running Testcontainers, but I may be missing something.

Reviewing the initial results it looks like the flagged results show improvements in coverage.

As far as the structure, does this mean that new integration tests will need to be listed in the include matrix, or only those with container-based tests?

@pvillard31 pvillard31 marked this pull request as draft December 31, 2025 10:44
@pvillard31
Contributor Author

Thanks @exceptionfactory - I didn't mean to remove the draft status on the PR yesterday; I just wanted to let it run so I could check the results this morning. The integration tests didn't run properly, so I'm fixing that and will look at improving things. I agree with you that caching does not seem useful and would add a maintainability burden.

@exceptionfactory
Contributor

Thanks for the update @pvillard31, feel free to ping once it is ready for review!

@pvillard31
Contributor Author

I'll explore how we could potentially include system tests in the code coverage but that seems to be a significant effort so I'll be doing that in a separate/follow-up issue. I think this is good as-is for now. Thanks @exceptionfactory !

@pvillard31 pvillard31 marked this pull request as ready for review December 31, 2025 18:16
Contributor

@exceptionfactory exceptionfactory left a comment

Thanks @pvillard31, this looks more succinct. I noted a couple minor recommendations.

Comment on lines 124 to 130
- name: Find Coverage Reports
  id: find-reports
  if: always()
  run: |
    REPORTS=$(find . -path "*/target/site/jacoco/jacoco.xml" -type f | tr '\n' ',' | sed 's/,$//')
    echo "reports=$REPORTS" >> $GITHUB_OUTPUT
    echo "Found $(echo "$REPORTS" | tr ',' '\n' | wc -l) coverage reports"
Contributor

Is this step necessary? It seems like some coverage reports should always be produced, and avoiding scripting is better for maintainability.

Contributor Author

You're right, I think this is a leftover that I didn't remove after trying the matrix build approach.

Contributor

@exceptionfactory exceptionfactory left a comment

Thanks for the updates @pvillard31, this looks closer to completion, but needs a few more adjustments to the integration tests job.

@pvillard31 pvillard31 marked this pull request as draft January 2, 2026 17:01
@pvillard31 pvillard31 marked this pull request as ready for review January 3, 2026 11:22
@pvillard31
Contributor Author

pvillard31 commented Jan 3, 2026

OK, this time we should be good: we are back to 52%+ coverage - cc @exceptionfactory
https://app.codecov.io/gh/apache/nifi/pull/10714/flags

Two key changes:

  • Added prepare-agent-integration to the report-code-coverage profile in the root pom.xml. The original approach of calling jacoco:prepare-agent-integration as a command-line goal didn't work because it only ran once at the reactor level, not for each module. By adding the JaCoCo plugin with the prepare-agent-integration goal to the report-code-coverage profile, it now runs during the initialize phase for every module when the profile is activated, properly attaching the JaCoCo agent to Failsafe integration tests and creating jacoco-it.exec files in each module.
  • Configured report-aggregate in nifi-code-coverage/pom.xml to include integration test coverage files (by default, it only looks for jacoco.exec files from unit tests). The dataFileIncludes allows it to aggregate coverage from both unit tests and integration tests into the final report.
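The two changes described above might look roughly like the following pom.xml fragments. This is a sketch based only on the description in this comment, not copied from the actual PR; execution ids, phase bindings, and the exact include patterns are assumptions.

```xml
<!-- Sketch 1: root pom.xml, report-code-coverage profile.
     Declaring the goal inside the profile (rather than invoking
     jacoco:prepare-agent-integration from the command line) makes it
     run for every module in the reactor, attaching the JaCoCo agent
     to Failsafe and producing a jacoco-it.exec per module. -->
<profile>
    <id>report-code-coverage</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.jacoco</groupId>
                <artifactId>jacoco-maven-plugin</artifactId>
                <executions>
                    <execution>
                        <id>prepare-agent-integration</id>
                        <phase>initialize</phase>
                        <goals>
                            <goal>prepare-agent-integration</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>

<!-- Sketch 2: nifi-code-coverage/pom.xml. By default report-aggregate
     only reads jacoco.exec (unit tests); dataFileIncludes widens it to
     also pick up the jacoco-it.exec files from integration tests. -->
<execution>
    <id>report-aggregate</id>
    <phase>verify</phase>
    <goals>
        <goal>report-aggregate</goal>
    </goals>
    <configuration>
        <dataFileIncludes>
            <dataFileInclude>**/jacoco.exec</dataFileInclude>
            <dataFileInclude>**/jacoco-it.exec</dataFileInclude>
        </dataFileIncludes>
    </configuration>
</execution>
```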
