Implement deltatimenormaliser into rhythm grouping logic (#33403)
Conversation
Code works and looks good to me; even if more elegant ways to do it come up eventually, getting this in before deploy is pretty important. I do hope rhythm can be less complicated in the future, but I wouldn't wish it on you to rewrite it again haha
osu.Game.Rulesets.Taiko/Difficulty/Preprocessing/Rhythm/Data/SameRhythmHitObjectGrouping.cs
osu.Game.Rulesets.Taiko/Difficulty/Utils/DeltatimeNormaliser.cs
What I said above 👍
!diffcalc

Difficulty calculation failed: https://github.com/ppy/osu/actions/runs/15765453424

!diffcalc

Are convert buffs expected? Doesn't seem like it from the change description.

It was anticipated that such things would occur, due to the convert process and the odd intervals that come along as a consequence. @buyaspacecube, can you confirm?

The sheet above is still really busted for some reason; the pp_master values are way too low. @stanriders, can we get pp dev merged in and another sheet if possible? I'm expecting some tiny buffs, as this changes rhythm grouping and some notes could now be placed in harder groups in theory.

!diffcalc

Seems to be better @buyaspacecube

Yeah, this looks right; we can pretty easily balance around the buffs and nerfs.

Happy with the new sheet too.
osu.Game.Rulesets.Taiko/Difficulty/Preprocessing/Rhythm/Data/SameRhythmHitObjectGrouping.cs
```csharp
double modalDelta = normalisedHitObjectDeltaTime.Count > 0
    ? normalisedHitObjectDeltaTime
        .Select(deltaTime => Math.Round(deltaTime))
```
What is the point of this..? Shouldn't the normaliser already decently group similar deltas together?
Its main intention is to ensure there isn't any 'noise' or outliers; it's basically a secondary check, done by taking the mode.
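For illustration, a complete form of the modal-delta expression quoted above might look like this minimal sketch (the `GroupBy`/ordering tail and the zero fallback are assumptions for illustration, not the PR's confirmed code):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Minimal sketch of the "secondary check" described above: round the
// normalised deltas, then take the mode so a handful of outlier deltas
// cannot shift the representative value. The GroupBy/ordering tail and
// the 0 fallback are assumed, not taken from the PR.
static double GetModalDelta(IReadOnlyList<double> normalisedHitObjectDeltaTime)
{
    return normalisedHitObjectDeltaTime.Count > 0
        ? normalisedHitObjectDeltaTime
          .Select(deltaTime => Math.Round(deltaTime))
          .GroupBy(deltaTime => deltaTime)
          .OrderByDescending(group => group.Count())
          .First().Key
        : 0;
}
```

Rounding first means deltas that the normaliser left a fraction of a millisecond apart still land in the same bucket before the mode is taken.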
Could this be avoided by doing Math.Round in the normaliser itself?
I'm going to preface this by saying this is an absolute nightmare to fix, and I'm sure we will have another crack at it in the future, but at a minimum the aim is to get this in so the issue is at least somewhat fixed before deploy. It may be improved later.
Essentially, two users (ler1211 and iRedi) came to the taiko ppc after encountering a game-breaking issue. At its core, you are able to shift star rating massively by constantly off-snapping objects; not only is this a grey area in the beatmap guidelines, it is also pretty much undetectable apart from the large variation in SR.
This was first discovered by me internally here, but, as an oversight, I had no idea how bad someone could make the issue, as a 0.02 difference didn't seem egregious enough to attempt to fix.
The aim of this PR is to normalise the deltatime values going into our grouping logic. This is to ensure that there is:
The downsides of this are:
This took tearing apart most of the code to figure out and find a way to do this. Credit to Babysnakes for the original idea, after I discovered the exact point where the issue lay. This is done by a new utility, a normaliser, which lets us take large sets of data and normalise their values using traditional median methods.
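For context, a median-based normaliser along those lines might look like the sketch below; the class name, the `Normalise` signature, and the `marginOfError` tolerance are all hypothetical, not the PR's actual `DeltatimeNormaliser`:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch under assumptions: deltas within a tolerance of each other are
// clustered, and every member of a cluster is mapped to the cluster's
// median, so deliberate off-snapping collapses back onto a shared value.
public static class DeltaTimeNormaliserSketch
{
    public static Dictionary<double, double> Normalise(IReadOnlyList<double> deltaTimes, double marginOfError = 3.0)
    {
        var result = new Dictionary<double, double>();
        // Sort so similar deltas are adjacent and can be clustered in one pass.
        var sorted = deltaTimes.Distinct().OrderBy(d => d).ToList();

        var cluster = new List<double>();

        foreach (double delta in sorted)
        {
            // A gap larger than the tolerance closes the current cluster.
            if (cluster.Count > 0 && delta - cluster[^1] > marginOfError)
            {
                assignMedian(cluster, result);
                cluster.Clear();
            }

            cluster.Add(delta);
        }

        if (cluster.Count > 0)
            assignMedian(cluster, result);

        return result;
    }

    private static void assignMedian(List<double> cluster, Dictionary<double, double> result)
    {
        // Median of the (already sorted) cluster: the middle element, or the
        // average of the two middle elements for an even count.
        double median = cluster.Count % 2 == 1
            ? cluster[cluster.Count / 2]
            : (cluster[cluster.Count / 2 - 1] + cluster[cluster.Count / 2]) / 2;

        foreach (double delta in cluster)
            result[delta] = median;
    }
}
```

Mapping each raw delta to its cluster's median is what makes the exploit ineffective: tiny constant offsets all normalise to the same value before the grouping logic ever sees them.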
If there are any better ideas, I certainly won't object; this was the best I could come up with, and the least invasive (and least performance-taxing) approach I could think of without rewriting rhythm for a third time.
A huismetbenen branch is located here.