Conversation

@softins (Member) commented Mar 22, 2022

Short description of changes

Adds rate-limiting for channel gain change messages.
This avoids a backlog of messages being queued due to ACK latency,
particularly when using a MIDI controller that sends fine-grained
level changes. After a gain change message is sent, further gain
changes are only applied locally for a delay period, after which the
latest value is sent to the server. This delay will be 50ms by default,
or double the current ping time, whichever is greater.
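The mechanism described above can be sketched as follows. This is an illustrative model only, not the actual Jamulus code: the struct, method names, and the simulated millisecond clock are all hypothetical, and the real implementation uses a Qt single-shot timer inside CClient.

```cpp
#include <cassert>
#include <map>
#include <utility>
#include <vector>

// Hypothetical sketch of the rate limiter: the first gain change is sent
// immediately and starts a delay timer; changes arriving while the timer
// runs are only stored locally, and the latest value per channel is
// flushed to the "server" when the timer expires.
struct GainRateLimiter
{
    explicit GainRateLimiter ( int iDelayMs ) :
        delayMs ( iDelayMs ), timerExpiresAt ( -1 ) {}

    int delayMs;        // e.g. max(50, 2 * ping time)
    int timerExpiresAt; // -1 means the timer is not running

    std::map<int, double> pending;            // channel id -> latest unsent gain
    std::vector<std::pair<int, double>> sent; // messages that reached the "server"

    void OnGainChanged ( int iChanId, double dGain, int iNowMs )
    {
        if ( timerExpiresAt < 0 )
        {
            // no timer running: send immediately and start the delay timer
            sent.push_back ( std::make_pair ( iChanId, dGain ) );
            timerExpiresAt = iNowMs + delayMs;
        }
        else
        {
            // timer running: only remember the latest value locally
            pending[iChanId] = dGain;
        }
    }

    void OnTick ( int iNowMs )
    {
        if ( timerExpiresAt >= 0 && iNowMs >= timerExpiresAt )
        {
            // timer expired: flush the latest stored value per channel
            timerExpiresAt = -1;
            for ( const auto& entry : pending )
            {
                sent.push_back ( entry );
            }
            pending.clear();
        }
    }
};
```

With this model, a burst of three fader moves within the delay window results in only two protocol messages: the first value immediately, then the latest value once the timer fires.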

CHANGELOG: Client: Fix potential long delay in sending fader changes to the server.

Context: Fixes an issue?

Fixes #2492

Does this change need documentation? What needs to be documented and how?

No, bug fix only

Status of this Pull Request

Tested and working

What is missing until this pull request can be merged?

Review and testing on other platforms, and by @bawbgale, who reported the issue

Checklist

  • I've verified that this Pull Request follows the general code principles
  • I tested my code and it does what I want
  • My code follows the style guide
  • I waited some time after this Pull Request was opened and all GitHub checks completed without errors.
  • I've filled all the content above

@softins (Member Author) commented Mar 22, 2022

Tested with single and grouped channels, and the Wireshark trace examined.

src/client.cpp Outdated
if ( newGain[iId] != oldGain[iId] )
{
// send new gain and record as old gain
Channel.SetRemoteChanGain ( iId, oldGain[iId] = newGain[iId] );
Collaborator:

When does the assignment happen? Before the call or after? I'd rather see it on a separate line so I don't have to think about it.

Member Author:

The assignment expression happens first, and the assigned value is used as the parameter. To me it's a common C idiom, and in this case it also avoids repeating the array lookup again (which the compiler may possibly optimise out, but I like to write optimal code of my own).

Member Author:

One could do:

fGain = oldGain[iId] = newGain[iId];
Channel.SetRemoteChanGain ( iId, fGain );

but to me the original is perfectly readable.
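For readers unfamiliar with the idiom being discussed, here is a minimal standalone illustration (the function and array names are hypothetical, not taken from the Jamulus source):

```cpp
#include <cassert>

// In C and C++ an assignment is an expression whose result is the value
// assigned, so it can be used directly as a function argument while also
// updating the array element in the same expression.
double AssignmentIdiomDemo()
{
    double oldGain[4] = { 0.0, 0.0, 0.0, 0.0 };
    double newGain[4] = { 0.0, 0.0, 0.5, 0.0 };

    // the inner assignment is evaluated first; its result (0.5) becomes
    // the outer value, and oldGain[2] is updated as a side effect
    double fGain = ( oldGain[2] = newGain[2] );

    assert ( oldGain[2] == 0.5 );
    return fGain;
}
```

This is why, in the original one-liner, the parameter passed to SetRemoteChanGain is guaranteed to equal the freshly recorded oldGain value.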

Member:

I find it readable, but it's easy to miss, IMO.
I would have gone with a simple two-step assignment / method call as I don't think this is in a performance-critical path.
The alternative (reuse fGain) also looks fine to me.

In the end, I don't feel strongly.

Member Author:

OK, I've updated it for clarity to use fGain.

@softins (Member Author) commented Mar 22, 2022

Harmless code alerts in Oboe, but GitHub won't let me dismiss them.

@ghost commented Mar 22, 2022

I think waiting for 50 milliseconds would give a more realistic response because 300 ms is too long.
Also, the server should make finer level changes for the same amount of time. For example, the client updates level changes every 50 ms if commanded, and the server may smoothly update the received level with several smaller level changes within the following 50 ms.
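The server-side smoothing idea could look roughly like the sketch below. This is not part of this PR and all names are hypothetical; it only illustrates ramping a received gain linearly over several processing blocks instead of jumping immediately.

```cpp
#include <cassert>

// Hypothetical sketch of server-side gain smoothing: on receiving a new
// target gain, step linearly towards it over a fixed number of audio
// processing blocks to avoid audible loudness jumps.
struct SmoothedGain
{
    double dCurrent   = 1.0; // gain applied to the current block
    double dTarget    = 1.0; // last gain received from the client
    double dStep      = 0.0; // per-block increment
    int    iStepsLeft = 0;

    void SetTarget ( double dNewTarget, int iRampBlocks )
    {
        dTarget    = dNewTarget;
        iStepsLeft = iRampBlocks;
        dStep      = ( dTarget - dCurrent ) / iRampBlocks;
    }

    // called once per audio processing block
    double NextBlockGain()
    {
        if ( iStepsLeft > 0 )
        {
            dCurrent += dStep;
            if ( --iStepsLeft == 0 )
            {
                dCurrent = dTarget; // land exactly on the target
            }
        }
        return dCurrent;
    }
};
```

As noted later in the thread, adding anything like this would touch the performance-critical mixing path, which is why it was deferred to a separate feature.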

// start timer so that elapsed time works
PreciseTime.start();

// set gain delay timer to single-shot and connect handler function
Member:

That comment almost repeats the variable/method names and provides little added value, but I guess it's OK. :)

Comment on lines +352 to +354
// but just stored in newGain[iId], and the minGainId and maxGainId updated to note the range of
// IDs that must be checked when the time expires (this will usually be a single channel
// unless channel grouping is being used). This avoids having to check all possible channels.
Member:

As we are talking about worst case n=200 or something and this is not part of the critical path (sound/network processing) and not part of something which scales exponentially (e.g. server-side per-client-per-client stuff), I think I would have gone with the check-all-channels approach in order to keep the logic simpler and shorter.

PS: Now this whole block comment provides value and is well-written! :)

Member Author:

Yes, but I figured if it is possible at little cost to limit the number of checks needed to the specific one or a few out of the 200 channels, it's worth doing. And of course include an explanation, as I did :)


@softins (Member Author) commented Mar 23, 2022

@softins I just tried that build with a server 160ms away and it works much better! MIDI controller changes take effect very quickly. Thank you!

Originally posted by @bawbgale in #2492 (comment)

This avoids a backlog of messages being queued due to ACK latency,
particularly if using a MIDI controller that sends fine-grained
level changes. After sending a gain change message, further gain
changes will be updated locally for 300ms and then the latest
value sent to the server.
@hoffie (Member) commented Mar 23, 2022

I've tested this PR and it does what it claims to do. Wireshark looks as expected.

At the same time, this causes very noticeable loudness jumps (similar to artifacts) when changing the fader level (tested by streaming music and listening to the sound coming back from the server while playing with the slider of my channel).
One could argue that the most common slider adjustments would be:

  • Slow, but not by large amounts
  • Quick, but only to make a (maybe new) participant less loud quickly

Therefore, it might not be too relevant.

With 50ms, it's much smoother again, but obviously causes more protocol messages.
@DavidSavinkoff's proposal to add server-side fading would probably work around that, but I don't see a way to add that without introducing new logic into the very critical sound mixing code path (we do have fading for new channels, but I'm not sure if it's possible to adapt).

In essence, I think

  • we should have this PR as it solves a rather nasty problem
  • we should try to use the lowest possible constant for delayed updates

@softins (Member Author) commented Mar 23, 2022

I think any server-side smoothing of gain changes might be a worthwhile improvement, but it should be a separate feature and PR, as it would only apply to servers running new code. The current PR addresses the root issue at the client, which is where it originates, and works when connected to any server version.

I did wonder about having a smaller default delay period, and then increasing it if necessary based on the ping time of the connected server. Maybe 50ms or 100ms by default, and 1.5x or 2x the ping time if greater (to allow a little extra time)? It sounds like that might be a worthwhile improvement.

@softins (Member Author) commented Mar 23, 2022

The ping time multiplied by a suitable constant could easily be stored in CClient by CClient::OnCLPingReceived() for use when starting the timer. Or probably just store it and then multiply by the constant when starting the timer.

@softins (Member Author) commented Mar 23, 2022

I've added a second commit to default the timer to max(50ms, pingtime*2), and refactored the timer starting into a separate function to be DRY. I tested against my private server in London and a server in Hong Kong and compared the packet traces. As expected, when changing a fader, the London server was sent updates at 50ms intervals, and the HK server at 400ms intervals.
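The delay computation described above can be expressed as a one-line helper. The function name is illustrative, not from the source; only the max(50ms, pingtime*2) behaviour is taken from the comment.

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical helper mirroring the described behaviour: the gain timer
// delay defaults to 50 ms, or twice the measured ping time if greater.
int GetGainTimerDelayMs ( int iPingTimeMs )
{
    const int iMinDelayMs = 50;
    return std::max ( iMinDelayMs, 2 * iPingTimeMs );
}
```

This matches the packet traces reported: a nearby server (ping well under 25ms) gets updates every 50ms, while a server 200ms away gets them every 400ms.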

@hoffie hoffie added this to the Release 3.9.0 milestone Mar 23, 2022
@hoffie (Member) left a review:

Thanks! This behaves much better and looks clean. Nice work :)

@softins softins merged commit c72195f into jamulussoftware:master Mar 23, 2022
hoffie added a commit to hoffie/jamulus that referenced this pull request Jul 17, 2022
PR jamulussoftware#2535 introduced rate limiting for gain change messages. The logic
required storing the previously used gain value per channel. This logic
had two issues:
1. The previously used gain value defaulted to 0, despite the server-side
   view of the channel being set to 1 (as the default). Therefore,
   gain(0) changes during a series of gain changes would be lost. The
   most common scenario would be the initial connection, which always
   triggers the rate limit and therefore the faulty logic. This also
   affected New Client Level = 0.
2. The previously used gain values were not reset upon changing servers.
   This might have caused losing arbitrary gain change messages, e.g.
   stored fader values.

This commit introduces a gain level memory reset to 1 (100%) on connect
to fix both of these issues.

Fixes: jamulussoftware#2730
hoffie added a commit to hoffie/jamulus that referenced this pull request Jul 17, 2022
PR jamulussoftware#2535 introduced rate limiting for gain change messages. The logic
required storing the previously used gain value per channel. This logic
had some flaws:
1. The previously used gain value defaulted to 0, despite the server-side
   view of the channel being set to 1 (as the default). Therefore,
   gain(0) changes during a series of gain changes would be lost. The
   most common scenario would be the initial connection, which always
   triggers the rate limit and therefore the faulty logic. This also
   affected New Client Level = 0.
2. The previously used gain values were not reset upon changing servers.
   This might have caused losing arbitrary gain change messages, e.g.
   stored fader values.
3. The previously used gain values were not reset upon a channel
   disconnect. This might have caused missing fader level restores.

This commit introduces a gain level memory reset to 1 (100%) on connect
as well as on channel disconnects to fix these issues.

Fixes: jamulussoftware#2730
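The fix described in the commit message above can be sketched as follows. The struct and method names are hypothetical, not the actual Jamulus code; the sketch only shows why the memory must start at 1.0 (the server-side default) and be reset on connect, so that an initial gain(0) change is not suppressed by comparing equal to stale memory.

```cpp
#include <cassert>
#include <vector>

// worst case of roughly 200 channels mentioned in the review discussion
const int MAX_NUM_CHANNELS = 200;

// Hypothetical sketch of the per-channel "old gain" memory with the fix:
// it defaults to 1.0 and is reset on connect / channel disconnect.
struct GainMemory
{
    std::vector<double> oldGain;

    GainMemory() : oldGain ( MAX_NUM_CHANNELS, 1.0 ) {}

    // call with no argument on connect, or with a channel id on disconnect
    void Reset ( int iChanId = -1 )
    {
        if ( iChanId < 0 )
        {
            oldGain.assign ( MAX_NUM_CHANNELS, 1.0 ); // full reset on connect
        }
        else
        {
            oldGain[iChanId] = 1.0; // single channel disconnected
        }
    }

    // returns true if the change differs from memory and must be sent
    bool NeedsSend ( int iChanId, double dNewGain )
    {
        if ( oldGain[iChanId] == dNewGain )
        {
            return false; // unchanged: suppress the message
        }
        oldGain[iChanId] = dNewGain;
        return true;
    }
};
```

Had the memory defaulted to 0 instead, the very first NeedsSend(id, 0.0) would wrongly return false, which is exactly the lost gain(0) message described in flaw 1.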
@hoffie mentioned this pull request Jul 17, 2022
hoffie added a commit to hoffie/jamulus that referenced this pull request Jul 17, 2022
PR jamulussoftware#2535 introduced rate limiting for gain change messages. The logic
required storing the previously used gain value per channel. This logic
had some flaws:
1. The previously used gain value defaulted to 0, despite the server-side
   view of the channel being set to 1 (as the default). Therefore,
   gain(0) changes during a series of gain changes would be lost. The
   most common scenario would be the initial connection, which always
   triggers the rate limit and therefore the faulty logic. This also
   affected New Client Level = 0.
2. The previously used gain values were not reset upon changing servers.
   This might have caused losing arbitrary gain change messages, e.g.
   stored fader values.
3. The previously used gain values were not reset upon a channel
   disconnect. This might have caused missing fader level restores.

This commit introduces a gain level memory reset to 1 (100%) on connect
as well as on channel disconnects to fix these issues.

Fixes: jamulussoftware#2730

Co-authored-by: ann0see <[email protected]>
@softins softins deleted the gain-rate-limit branch August 30, 2023 17:29
Development

Successfully merging this pull request may close these issues.

Effect of MIDI fader audio changes lags behind UI changes
