
Conversation

@blp (Member) commented Jan 28, 2026

…put.

Killing the pipeline then reading the output makes abrupt termination of the HTTP connection possible, which can cause test failures.

This is an experiment to see whether it makes CI more reliable.


Signed-off-by: Ben Pfaff <[email protected]>
@blp blp requested review from abhizer and ryzhyk January 28, 2026 01:00
@blp blp self-assigned this Jan 28, 2026
@blp blp added the bug and CI/CD labels Jan 28, 2026
Copilot AI review requested due to automatic review settings January 28, 2026 01:00
@blp blp added the QA and python labels Jan 28, 2026
Copilot AI (Contributor) left a comment

Pull request overview

Moves pipeline shutdown in platform tests to occur after consuming output, aiming to avoid abrupt HTTP connection termination and reduce CI flakiness.

Changes:

  • Reordered self.pipeline.stop(...) calls to happen after out.to_pandas()/out.to_dict() across multiple tests.
  • Adjusted multi-phase tests (e.g., test_pandas_map) to stop the pipeline after verifying the first phase’s output.
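The reordering this PR applies can be sketched as follows. `Pipeline` and `Output` here are minimal hypothetical stand-ins so the pattern is self-contained; they are not the real Feldera SDK classes, whose APIs may differ.

```python
# Minimal sketch of the reordered shutdown; Pipeline and Output are
# hypothetical stand-ins, not the real SDK classes.
class Output:
    def to_dict(self):
        # Stand-in for draining all rows over the HTTP connection.
        return [{"id": i} for i in range(100)]

class Pipeline:
    def __init__(self):
        self.stopped = False

    def stop(self, force=False):
        self.stopped = True

def run_reordered_test():
    pipeline, out = Pipeline(), Output()
    rows = out.to_dict()       # consume the output first...
    assert len(rows) == 100    # ...verify it...
    pipeline.stop(force=True)  # ...then stop, so the HTTP connection
                               # is never torn down mid-read
    return pipeline.stopped
```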

Comment on lines 189 to +191
df = out.to_pandas()
assert df.shape[0] == 100
self.pipeline.stop(force=True)
Copilot AI commented Jan 28, 2026
With stop() moved after assertions, a failing assert (or any exception in expected-value construction) will skip self.pipeline.stop(...), potentially leaking a running pipeline and causing follow-on test interference/flakiness. Consider ensuring cleanup always runs by wrapping the output/verification portion in a try/finally (or equivalent test cleanup hook), and/or moving self.pipeline.stop(...) to immediately after out.to_pandas()/out.to_dict() (before assertions) so shutdown still happens even when assertions fail.
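A hedged sketch of the try/finally cleanup this comment suggests. `Pipeline` is again a hypothetical stand-in, not the real SDK class; the point is only that shutdown runs regardless of how the verification step exits.

```python
# Sketch of the suggested try/finally cleanup; Pipeline is a
# hypothetical stand-in used to keep the example self-contained.
class Pipeline:
    def __init__(self):
        self.stopped = False

    def stop(self, force=False):
        self.stopped = True

def run_test_with_cleanup(expected_rows=100):
    pipeline = Pipeline()
    rows = list(range(100))  # stand-in for out.to_pandas()
    try:
        # A failing assertion here no longer skips shutdown.
        assert len(rows) == expected_rows
    finally:
        pipeline.stop(force=True)  # always runs, even on assert failure
    return pipeline.stopped
```

In a unittest-style suite, registering `self.pipeline.stop` via a teardown or cleanup hook achieves the same guarantee without repeating the try/finally in every test.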

@mihaibudiu (Contributor) commented:

We may need to update the documentation too; you should probably file an issue and assign it to @abhizer

@blp (Member, Author) commented Jan 28, 2026

We may need to update the documentation too; you should probably file an issue and assign it to @abhizer

What kind of documentation update are you envisioning?

@blp blp added this pull request to the merge queue Jan 28, 2026
@mihaibudiu (Contributor) commented:

The Python SDK documentation is bound to have examples like this.

Merged via the queue into main with commit 47ab6cd Jan 28, 2026
6 of 7 checks passed
@blp blp deleted the python-integration-tests branch January 28, 2026 03:52