
Need a way to verify flaky golden test fixes #111325

Closed · opened by @yjbanov

Description

Currently, if a golden test is flaky we just skip it, e.g.:
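A minimal sketch of what that skip looks like today (the test body and file name are hypothetical; `skip` is the standard `testWidgets` parameter). Because the whole test is disabled, no image is ever generated or sent to Skia Gold:

```dart
import 'package:flutter/widgets.dart';
import 'package:flutter_test/flutter_test.dart';

void main() {
  testWidgets('golden test for Placeholder', (WidgetTester tester) async {
    await tester.pumpWidget(const Placeholder());
    await expectLater(
      find.byType(Placeholder),
      matchesGoldenFile('placeholder.png'),
    );
  }, skip: true); // Skipped because the golden is flaky.
}
```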

Now suppose the flake is fixed (perhaps in Skia or in the engine). How do we verify that? Skia Gold has a handy feature that, for a given test, shows how stable the generated goldens are by giving each golden variant a unique color. For example, in the screenshot below the black, orange, and green circles indicate that the test generated three variations of a golden, i.e. it is flaky:

[Screenshot: Skia Gold digest history with black, orange, and green dots, indicating three golden variants]

A non-flaky golden test will show a continuous string of dots of the same color, e.g.:

[Screenshot: Skia Gold digest history with an unbroken run of same-colored dots]

Unfortunately, we can't use this feature, because when we skip a test we stop sending images to Skia Gold entirely. The only option is to speculatively unskip the test and hope that it's no longer flaky. The cost of a mistake is a closed tree, P0s, wasted time, and other sadness.

Feature request

Add an optional parameter to matchesGoldenFile: { bool isFlaky = false }. When it is set to true, we continue generating the golden and sending it to Skia Gold, but we don't fail the test on a mismatch. This has the same effect as skipping the test, except that it lets us monitor the golden over time; once the flake is fixed, the isFlaky argument can simply be removed.
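Under this proposal, marking a flaky golden would look something like the sketch below. Note that isFlaky does not exist in matchesGoldenFile today; the name and default come from this request:

```dart
testWidgets('golden test for Placeholder', (WidgetTester tester) async {
  await tester.pumpWidget(const Placeholder());
  await expectLater(
    find.byType(Placeholder),
    // Proposed: the golden is still generated and uploaded to Skia Gold,
    // but a mismatch no longer fails the test.
    matchesGoldenFile('placeholder.png', isFlaky: true),
  );
}); // No skip: the test keeps running and can be monitored in Gold.
```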

Additionally, flutter test could print a warning to the console about the flaky golden, and those warnings could feed into our technical debt calculation.
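One rough sketch of how this could work internally (the helper below is hypothetical; the real matchesGoldenFile internals differ): keep invoking goldenFileComparator, which is what uploads images to Skia Gold on CI, but downgrade a mismatch to a console warning when isFlaky is true.

```dart
import 'dart:typed_data';

import 'package:flutter/foundation.dart';
import 'package:flutter_test/flutter_test.dart';

/// Hypothetical helper: always run the comparison (so the image still
/// reaches Skia Gold), but when [isFlaky] is true never fail the test,
/// only print a warning. Returns an error string on failure, null on pass.
Future<String?> compareGolden(
  Uint8List imageBytes,
  Uri golden, {
  bool isFlaky = false,
}) async {
  try {
    final bool matched =
        await goldenFileComparator.compare(imageBytes, golden);
    if (!matched && !isFlaky) {
      return 'Golden "$golden" did not match.';
    }
    if (!matched) {
      debugPrint('Warning: flaky golden "$golden" did not match.');
    }
    return null; // Test passes.
  } catch (e) {
    if (!isFlaky) rethrow;
    debugPrint('Warning: flaky golden "$golden" threw: $e');
    return null;
  }
}
```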

Metadata

Labels

- P1 (High-priority issues at the top of the work list)
- c: contributor-productivity (Team-specific productivity, code health, technical debt)
- c: flake (Tests that sometimes, but not always, incorrectly pass)
- infra: auto flake bot (Issues with the bot that files flake issues)
- team-infra (Owned by Infrastructure team)
