fix allocation of very large blocks #77
Conversation
Thanks for the PR!
Glad to see you've made the FLAC parser a bit more resilient against malicious FLAC streams.
I've added a few comments to increase the threshold limits. I do believe you set quite reasonable limits, but let's raise them a bit to ensure no valid use cases are blocked by the thresholds.
Cheers,
Robin
hi @mewmew, thank you for the review. I've increased the limits as you suggested. I agree that it's important not to break existing valid use cases. I think that even with higher limits it's still difficult to exploit the lib, except maybe in some large-scale, high-throughput scenarios.
Thanks @whisk! I've now merged the change :) Wish you a lovely late summer and happy coding!
Hmm, at least locally the test cases fail after merging this PR. @whisk, do you have a chance to take a look? And perhaps adjust the thresholds if needed, or treat the error for these test cases as a valid success?
@mewmew, I see that "testdata" is missing some of the test files, for example:
In any case, it's still not a problem to further increase the limits for the number of seek points and number of tags. Please see #78 for this.
The release date of v1.0.14 is yet to be announced; it will likely wait until other features and/or fixes are merged. If you depend on the more robust parsing introduced in #77, pin to a specific commit for now.
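With Go modules, pinning to a commit looks like the following (assuming the module path is github.com/mewmew/flac; the commit hash is a placeholder):

```
go get github.com/mewmew/flac@<commit-sha>
```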
Description
Corrupt or malicious files may declare unreasonably large metadata blocks even if the actual data itself is much smaller. The parser allocates buffers of the declared size when reading such files, which may lead to out-of-memory errors.
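For context, the vulnerable pattern looks roughly like this minimal sketch (the function and package names are hypothetical, not the library's actual API): a declared length is read from the block header and trusted to size an allocation before any payload bytes arrive.

```go
package flac // hypothetical package name, for illustration only

import (
	"encoding/binary"
	"io"
)

// readBlock sketches the vulnerable pattern: the declared block length is
// trusted and used to size an allocation before any payload is read.
func readBlock(r io.Reader) ([]byte, error) {
	var length uint32
	if err := binary.Read(r, binary.BigEndian, &length); err != nil {
		return nil, err
	}
	// A corrupt file can declare length = 0xFFFFFFFF; the buffer is
	// allocated up front even though the stream may hold far less data.
	buf := make([]byte, length)
	if _, err := io.ReadFull(r, buf); err != nil {
		return nil, err
	}
	return buf, nil
}
```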
Ways to reproduce
I added a test case TestVorbisCommentTooManyTagsOOM to demonstrate slowness and/or OOM after several iterations. In this PR it passes because of the fix.
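A sketch of the reproduction idea (not the exact test from the PR; parseVorbisComment is a hypothetical stand-in for the library's internal block parser):

```go
package flac // hypothetical package name, for illustration only

import (
	"bytes"
	"encoding/binary"
	"testing"
)

// Build a Vorbis comment block that declares an absurd tag count but
// carries no tag data, then feed it to the parser.
func TestVorbisCommentTooManyTagsOOM(t *testing.T) {
	var buf bytes.Buffer
	vendor := "test-vendor"
	binary.Write(&buf, binary.LittleEndian, uint32(len(vendor)))
	buf.WriteString(vendor)
	// Declare ~4 billion tags; the stream ends immediately afterwards, so
	// a naive parser pre-allocates storage for tags it can never read.
	binary.Write(&buf, binary.LittleEndian, uint32(0xFFFFFFFF))
	if _, err := parseVorbisComment(&buf); err == nil {
		t.Fatal("expected an error for the absurd declared tag count")
	}
}
```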
Fix
I added upper thresholds for the number of tags in a Vorbis comment, the number of seek points, and the raw picture size. If a threshold is hit, parsing fails with an error of type ErrDeclaredBlockTooBig. The thresholds are loose and should accommodate any practical use case I can imagine; however, it should now be difficult to exploit the parser.
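A minimal sketch of what such a threshold check can look like; the error type's exact shape and the limit values below are illustrative placeholders, not the values merged in the PR:

```go
package flac // hypothetical package name, for illustration only

import "fmt"

// ErrDeclaredBlockTooBig mirrors the error described above; its fields and
// message here are placeholders.
type ErrDeclaredBlockTooBig struct {
	Block    string // which metadata block tripped the check
	Declared uint64 // size or count declared in the stream
	Limit    uint64 // configured upper threshold
}

func (e ErrDeclaredBlockTooBig) Error() string {
	return fmt.Sprintf("flac: %s declares %d, exceeding limit %d", e.Block, e.Declared, e.Limit)
}

// Illustrative thresholds, deliberately loose.
const (
	maxVorbisTags  = 1 << 20 // tags in a Vorbis comment
	maxSeekPoints  = 1 << 20 // entries in a seek table
	maxPictureSize = 1 << 28 // raw picture size in bytes
)

// checkDeclared enforces a threshold before any allocation happens.
func checkDeclared(block string, declared, limit uint64) error {
	if declared > limit {
		return ErrDeclaredBlockTooBig{Block: block, Declared: declared, Limit: limit}
	}
	return nil
}
```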