[2.10.x] Avoid running out of memory when parsing heavily nested arrays or objects #1226
Conversation
Force-pushed from a1066a5 to 73d4ba8 (compare)
Review comment on:

    throw new RuntimeException("We should have been reading an object, something got wrong")
  }

  val defaultMaxDepth = 1000 // Same as Jackson's 2.15+ StreamReadConstraints.DEFAULT_MAX_DEPTH
fyi, in Play itself we cap the depth for json form handling at just 64: https://github.com/playframework/playframework/blob/3.0.9/core/play/src/main/scala/play/api/data/Form.scala#L375-L381
So I guess 1000 should absolutely not be a problem.
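(For comparison, Jackson 2.15+ exposes the same kind of limit through StreamReadConstraints, the constant the default above is aligned with. A rough sketch of how such a limit would be tightened when using Jackson directly, e.g. down to 64 like Play's form handling; this is not part of this PR:

```scala
import com.fasterxml.jackson.core.{JsonFactory, StreamReadConstraints}
import com.fasterxml.jackson.databind.ObjectMapper

// Jackson 2.15+ ships with StreamReadConstraints.DEFAULT_MAX_DEPTH = 1000;
// a stricter nesting limit can be configured on the JsonFactory like this:
val factory = JsonFactory
  .builder()
  .streamReadConstraints(StreamReadConstraints.builder().maxNestingDepth(64).build())
  .build()
val mapper = new ObjectMapper(factory)
```

Input nested deeper than the configured limit then fails with a StreamConstraintsException instead of exhausting memory.)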
Review comment on:

  val defaultMaxDepth = 1000 // Same as Jackson's 2.15+ StreamReadConstraints.DEFAULT_MAX_DEPTH
  // system property to override the max nesting depth for JSON parsing.
  val maxNestingDepth: String = "play.json.parser.maxNestingDepth"
We introduced the same sys property in the main branch's PR when we upgraded Jackson to 2.19: https://github.com/playframework/play-json/pull/1072/files
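(With that property, the 1000 default can be overridden without code changes. A minimal sketch, assuming the property is read when the parser settings are initialized, so it has to be set before the first parse, e.g. via the JVM flag -Dplay.json.parser.maxNestingDepth=64 or very early in the program:

```scala
import scala.util.Try
import play.api.libs.json.Json

object MaxDepthOverride extends App {
  // Assumption: play-json reads this property once at parser initialization,
  // so it must be set before Json.parse is first used.
  System.setProperty("play.json.parser.maxNestingDepth", "64")

  // 32 nested arrays: well below the overridden limit, expected to parse fine.
  val shallow = "[" * 32 + "]" * 32
  println(Json.parse(shallow))

  // 128 nested arrays: exceeds the overridden limit and should be rejected
  // (the exact exception type is not shown in this PR excerpt).
  val tooDeep = "[" * 128 + "]" * 128
  println(Try(Json.parse(tooDeep)))
}
```
)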
Avoid running out of memory when parsing heavily nested arrays or objects

Just like Jackson 2.15+ we restrict the maximum allowed number of nested arrays or objects (or mixed) to 1000. This default can be changed via a sys property. 1000 should be enough for most real-world use cases. Note this is about OutOfMemoryErrors, not about StackOverflowErrors. StackOverflowErrors are not a problem since we use a @tailrec-optimized method. Therefore this fix is not 100% about CVE-2025-52999 (which in theory we do not run into) but just an additional precaution.
Force-pushed from 73d4ba8 to 90ad9c3 (compare)
@Mergifyio backport main 3.0.x

✅ Backports have been created
Avoid running out of memory when parsing heavily nested arrays or objects (#1226) (#1228): backport commit with the same message as above (cherry picked from commit 9722c66). Co-authored-by: Matthias Kurz <[email protected]>
Avoid running out of memory when parsing heavily nested arrays or objects (#1226) (#1227): backport commit with the same message as above (cherry picked from commit 9722c66). Co-authored-by: Matthias Kurz <[email protected]>
Just like Jackson 2.15+ we restrict the maximum allowed number of nested arrays or objects (or mixed) to 1000. The limit is not hardcoded: it can be overridden via a sys property now, just like in the main branch as of #1072. 1000 should be enough for most real-world use cases.

Note this is about OutOfMemoryErrors, not about StackOverflowErrors. StackOverflowErrors are not a problem since we use a @tailrec-optimized method. Therefore this fix is not 100% about CVE-2025-52999 (which in theory we do not run into) but just an additional precaution. (Again, technically we are not affected by that CVE, since it is about running out of stack. In theory, however, a bad actor could still make an app run out of memory, which is more or less the same, if that bad actor gets past any other security measures such as max content size or body parser max buffer limits.)
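To illustrate the mechanism described above (a sketch of the idea only, not the actual play-json parser code): a @tailrec loop can carry the current nesting depth and fail fast once the configured maximum is exceeded, so deeply nested input can neither blow the call stack nor accumulate unbounded state. The token types and exception below are hypothetical stand-ins for the real parser events:

```scala
import scala.annotation.tailrec

// Hypothetical token types standing in for the real parser events.
sealed trait Token
case object StartNesting extends Token // '[' or '{'
case object EndNesting   extends Token // ']' or '}'
case object Scalar       extends Token // any other value

def checkDepth(tokens: List[Token], maxDepth: Int = 1000): Unit = {
  // Tail-recursive, so deeply nested input cannot cause a StackOverflowError here;
  // the depth check makes parsing fail fast instead of building ever-deeper structures.
  @tailrec
  def loop(rest: List[Token], depth: Int): Unit = rest match {
    case Nil => ()
    case StartNesting :: tail =>
      if (depth + 1 > maxDepth)
        throw new IllegalArgumentException(s"JSON nesting depth exceeds $maxDepth")
      else loop(tail, depth + 1)
    case EndNesting :: tail => loop(tail, depth - 1)
    case Scalar :: tail     => loop(tail, depth)
  }
  loop(tokens, 0)
}

// Example: 1500 nested arrays are rejected once depth 1000 is exceeded.
// checkDepth(List.fill(1500)(StartNesting) ++ List.fill(1500)(EndNesting))
```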