
Conversation

@Lawtrohux
Member

@Lawtrohux Lawtrohux commented Jun 3, 2025

I'll preface this by saying this was an absolute nightmare to fix, and I'm sure we will have another crack at it in the future, but at a minimum the aim is to get this in so the issue is at least somewhat fixed before deploy. It may be improved further later.

Essentially, two users (ler1211 and iRedi) came to the taiko ppc after encountering a game-breaking issue. At its core, you can shift star rating massively by constantly off-snapping objects; not only is this a grey area in the beatmap guidelines, it is also nearly undetectable apart from the large variation in SR.

I first discovered this internally here, but as an oversight I had no idea how bad someone could make the issue, as a 0.02 difference didn't seem egregious enough to warrant a fix.

The aim of this PR is to normalise the delta time values going into our grouping logic. This ensures:

  • No variation in either group duration or frequency
  • An inability to change SR with the exact same object placement apart from off-snaps

The downsides of this are:

  • It's still not 1:1; however, in our sample maps (credits to ler and iredi), the difference was brought down to 0.02 or 0.00 SR, from a nearly 2* difference in the extreme cases.

This took tearing apart most of the code to figure out and find a way to fix. Credits to Babysnakes for the original idea, after I pinpointed exactly where the issue lay. The fix is a new utility, a normaliser, which lets us take large sets of data and normalise their values using traditional median methods.
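For illustration only, here is a rough Python sketch of the "normalise using traditional median methods" idea described above: cluster near-equal delta times and snap each one to its cluster's median, so off-snap jitter collapses to a single representative interval before grouping. The actual utility in the PR is C#, and the `margin` tolerance here is a made-up parameter, not a value from the PR.

```python
from statistics import median

def normalise_deltas(delta_times, margin=3.0):
    """Snap near-equal delta times (ms) to their cluster's median.

    Hypothetical sketch: tiny off-snap variations (e.g. 166, 167, 168 ms)
    collapse into one value, so downstream grouping logic sees identical
    intervals regardless of off-snapping. `margin` is an assumed tolerance.
    """
    clusters = []  # lists of similar delta times, built over sorted input
    for dt in sorted(delta_times):
        if clusters and dt - clusters[-1][-1] <= margin:
            clusters[-1].append(dt)
        else:
            clusters.append([dt])
    # Map every raw delta to the median of the cluster it landed in.
    snap = {dt: median(cluster) for cluster in clusters for dt in cluster}
    return [snap[dt] for dt in delta_times]
```

With this, two placements that differ only by off-snaps produce the same normalised delta sequence, which is the property the PR is after.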

If there are any better ideas, I certainly won't object, this was the best I could come up with that was the least invasive (and least performance taxing) way I could think of without rewriting rhythm for the third time.

A huismetbenen branch is located here

Member

@buyaspacecube buyaspacecube left a comment


code works and looks good to me, even if more elegant ways to do it come up eventually, getting this in before deploy is pretty important. i do hope rhythm can be less complicated in the future but i wouldn't wish it on you to rewrite it again haha

@buyaspacecube buyaspacecube self-requested a review June 9, 2025 15:40
Member

@buyaspacecube buyaspacecube left a comment


what i said above 👍

@buyaspacecube buyaspacecube moved this from Pending Review to Pending Maintainers Review in Difficulty calculation changes Jun 9, 2025
@stanriders
Member

!diffcalc
RULESET=taiko
OSU_A=https://github.com/ppy/osu/tree/pp-dev
OSU_B=#33403

@github-actions

Difficulty calculation failed: https://github.com/ppy/osu/actions/runs/15765453424

@stanriders
Member

!diffcalc
RULESET=taiko
OSU_A=https://github.com/ppy/osu/tree/pp-dev
OSU_B=#33403

@github-actions

@stanriders
Copy link
Member

Are convert buffs expected? Doesn't seem like it from the change description

@Lawtrohux
Member Author

It was anticipated that this would occur, due to the convert process and the odd intervals that come along as a consequence. @buyaspacecube, can you confirm?

@buyaspacecube
Member

the sheet above is still really busted for some reason, the pp_master values are way too low. @stanriders can we get pp dev merged in and another sheet if possible? i'm expecting some tiny buffs as this changes rhythm grouping and some notes could now be placed in harder groups in theory

@stanriders
Member

!diffcalc
RULESET=taiko
OSU_A=https://github.com/ppy/osu/tree/pp-dev
OSU_B=#33403

@github-actions

@stanriders
Member

Seems to be better @buyaspacecube

@buyaspacecube
Member

yeah this looks right, we can pretty easily balance around the buffs and nerfs

@Lawtrohux
Member Author

happy with new sheet too.


double modalDelta = normalisedHitObjectDeltaTime.Count > 0
? normalisedHitObjectDeltaTime
.Select(deltaTime => Math.Round(deltaTime))
Member


What is the point of this..? Shouldn't the normaliser already decently group similar deltas together?

Member Author


its main intention is to ensure there isn't any 'noise' or outliers, basically acting as a secondary check by taking the mode
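As a rough illustration of that secondary check (hypothetical Python, not the PR's C#): round each delta to the nearest millisecond first, then take the mode, so near-identical off-snapped values fall into the same bucket instead of splitting the count.

```python
from collections import Counter

def modal_delta(deltas):
    """Round deltas (ms) to whole milliseconds, then return the mode.

    Sketch of the 'secondary check' idea: 166.6 and 167.4 both round to
    167, so off-snap noise collapses into one bucket before the most
    common value is picked. The empty-list fallback of 0.0 is an assumption.
    """
    if not deltas:
        return 0.0
    rounded = [round(dt) for dt in deltas]
    return float(Counter(rounded).most_common(1)[0][0])
```

This mirrors why the `Math.Round` appears before the mode is taken in the snippet above: without the rounding, every slightly off-snapped delta would be its own candidate mode.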

Member


Could this be avoided by doing Math.Round in the normaliser itself?

@tsunyoku tsunyoku enabled auto-merge (squash) August 28, 2025 13:15
@tsunyoku tsunyoku merged commit 087f056 into ppy:pp-dev Aug 28, 2025
3 of 8 checks passed
@github-project-automation github-project-automation bot moved this from Pending Maintainers Review to Pending Deploy in Difficulty calculation changes Aug 28, 2025
@Lawtrohux Lawtrohux deleted the delta-normal branch August 28, 2025 13:20

Projects

Status: Pending Deploy

4 participants