fix: add FST_ERR_CTP_INVALID_JSON_BODY #5925
Conversation
How about avoiding creating stack traces, like this?

```js
function defaultJsonParser (req, body, done) {
  const stackTraceLimit = Error.stackTraceLimit
  Error.stackTraceLimit = 0
  try {
    if (body === '' || body == null || (Buffer.isBuffer(body) && body.length === 0)) {
      return done(new FST_ERR_CTP_EMPTY_JSON_BODY(), undefined)
    }
    const json = parse(body, parserOptions)
    done(null, json)
  } catch {
    return done(new FST_ERR_CTP_INVALID_JSON_BODY(), undefined)
  } finally {
    Error.stackTraceLimit = stackTraceLimit
  }
}
```
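For context, a minimal standalone sketch of what the `Error.stackTraceLimit` trick does (plain Node.js, nothing Fastify-specific): with the limit at 0, V8 captures no stack frames, so the error carries only its header line.

```javascript
// Minimal sketch of the Error.stackTraceLimit trick: with the limit set to 0,
// V8 skips the (comparatively expensive) stack frame capture entirely.
const previousLimit = Error.stackTraceLimit

Error.stackTraceLimit = 0
const cheapError = new Error('invalid body')
Error.stackTraceLimit = previousLimit

const normalError = new Error('invalid body')

console.log(cheapError.stack)                          // header line only: "Error: invalid body"
console.log(normalError.stack.split('\n').length > 1)  // frames present: true
```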
Signed-off-by: Aras Abbasi <[email protected]>
mcollina
left a comment
lgtm
@mcollina
What's the performance difference with the stack trace limitation?
Benchmark case:

```js
'use strict'

const fastify = require('../../fastify')({
  logger: false
})

fastify
  .post('/', function (req, reply) {
    reply
      .send({ hello: req.body.hello })
  })

fastify.listen({ port: 3000 }, (err, address) => {
  if (err) throw err
})
```

Sending valid JSON:

```
autocannon -m POST -H content-type=application/json -b '{"hello":"world"}' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections

┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 0 ms  │ 1 ms │ 0.02 ms │ 0.21 ms │ 30 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬─────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg     │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Req/Sec   │ 14,583  │ 14,583  │ 23,055  │ 23,535  │ 22,300  │ 2,465.92 │ 14,583  │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Bytes/Sec │ 2.74 MB │ 2.74 MB │ 4.34 MB │ 4.42 MB │ 4.19 MB │ 463 kB   │ 2.74 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴─────────┴──────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11

245k requests in 11.01s, 46.1 MB read
```

Sending invalid JSON on the `main` branch:

```
aras@aras-HP-ZBook-15-G3:~/workspace/fastify$ autocannon -m POST -H content-type=application/json -b '{"hello":"world"' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections

┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 1 ms  │ 1 ms │ 0.09 ms │ 0.34 ms │ 18 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬─────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg     │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Req/Sec   │ 11,007  │ 11,007  │ 17,039  │ 17,279  │ 16,420  │ 1,730.68 │ 11,002  │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Bytes/Sec │ 3.48 MB │ 3.48 MB │ 5.39 MB │ 5.46 MB │ 5.19 MB │ 547 kB   │ 3.48 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴─────────┴──────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11
0 2xx responses, 180634 non 2xx responses

181k requests in 11.01s, 57.1 MB read
```

This branch with my suggested change:

```js
function getDefaultJsonParser (onProtoPoisoning, onConstructorPoisoning) {
  const parse = secureJson.parse
  const parseOptions = { protoAction: onProtoPoisoning, constructorAction: onConstructorPoisoning }
  return defaultJsonParser

  function defaultJsonParser (req, body, done) {
    const stackTraceLimit = Error.stackTraceLimit
    Error.stackTraceLimit = 0
    try {
      if (body === '' || body == null || (Buffer.isBuffer(body) && body.length === 0)) {
        return done(new FST_ERR_CTP_EMPTY_JSON_BODY(), undefined)
      }
      const json = parse(body, parseOptions)
      done(null, json)
    } catch {
      return done(new FST_ERR_CTP_INVALID_JSON_BODY(), undefined)
    } finally {
      Error.stackTraceLimit = stackTraceLimit
    }
  }
}
```

```
aras@aras-HP-ZBook-15-G3:~/workspace/fastify$ autocannon -m POST -H content-type=application/json -b '{"hello":"world"' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections

┌─────────┬──────┬──────┬───────┬──────┬────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg    │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 1 ms  │ 1 ms │ 0.1 ms │ 0.34 ms │ 20 ms │
└─────────┴──────┴──────┴───────┴──────┴────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬──────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg      │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Req/Sec   │ 10,951  │ 10,951  │ 17,039  │ 17,375  │ 16,289.1 │ 1,737.52 │ 10,944  │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Bytes/Sec │ 3.73 MB │ 3.73 MB │ 5.81 MB │ 5.92 MB │ 5.55 MB  │ 592 kB   │ 3.73 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴──────────┴──────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11
0 2xx responses, 179183 non 2xx responses

179k requests in 11.01s, 61.1 MB read
```

Alternative:

```js
const stackTraceLimit = Error.stackTraceLimit
Error.stackTraceLimit = 0
const defaultInvalidJSONError = new FST_ERR_CTP_INVALID_JSON_BODY()
Error.stackTraceLimit = stackTraceLimit

function getDefaultJsonParser (onProtoPoisoning, onConstructorPoisoning) {
  const parse = secureJson.parse
  const parseOptions = { protoAction: onProtoPoisoning, constructorAction: onConstructorPoisoning }
  return defaultJsonParser

  function defaultJsonParser (req, body, done) {
    if (body === '' || body == null || (Buffer.isBuffer(body) && body.length === 0)) {
      return done(new FST_ERR_CTP_EMPTY_JSON_BODY(), undefined)
    }
    let json
    try {
      json = parse(body, parseOptions)
    } catch (err) {
      err.code = 'FST_ERR_CTP_INVALID_JSON_BODY'
      err.statusCode = 400
      err.message = "Body is not valid JSON but content-type is set to 'application/json'"
      Object.setPrototypeOf(err, defaultInvalidJSONError)
      return done(err, undefined)
    }
    done(null, json)
  }
}
```

```
aras@aras-HP-ZBook-15-G3:~/workspace/fastify$ autocannon -m POST -H content-type=application/json -b '{"hello":"world"' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections

┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 1 ms  │ 1 ms │ 0.11 ms │ 0.36 ms │ 20 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬───────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg       │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Req/Sec   │ 9,639   │ 9,639   │ 16,607  │ 17,055  │ 16,014.91 │ 2,031.5 │ 9,633   │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Bytes/Sec │ 3.28 MB │ 3.28 MB │ 5.66 MB │ 5.82 MB │ 5.46 MB   │ 694 kB  │ 3.28 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴───────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11
0 2xx responses, 176182 non 2xx responses

176k requests in 11.01s, 60.1 MB read
```
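The gap above can also be reproduced in isolation with a rough micro-benchmark (a sketch; absolute numbers depend on the machine and Node.js version, only the relative difference is interesting):

```javascript
// Rough micro-benchmark sketch: time error construction with and without
// stack capture. Absolute timings are machine-dependent.
function time (label, fn, iterations = 100000) {
  const start = process.hrtime.bigint()
  for (let i = 0; i < iterations; i++) fn()
  const ms = Number(process.hrtime.bigint() - start) / 1e6
  console.log(`${label}: ${ms.toFixed(1)} ms`)
  return ms
}

const withStacks = time('with stack capture', () => new Error('invalid body'))

const previousLimit = Error.stackTraceLimit
Error.stackTraceLimit = 0
const withoutStacks = time('without stack capture', () => new Error('invalid body'))
Error.stackTraceLimit = previousLimit
```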
gurgunday
left a comment
Please don't merge before tonight, I'd like to take a look as well
Yeah, don't worry.
I'm OK with the suggestion; I'd say this is possibly not a hot path per se, so the regression from your stack trace suggestion should be fine.
Really interesting alternatives! Even if it takes more code, I think we should mirror secure-json-parse and not generate stack traces for the validation errors. But this version might be problematic: during `done`, potentially while user code is running, stack traces will still be disabled, I believe? We'd need to turn the limit off just before the error creation and turn it back on right after.
```js
function getDefaultJsonParser (onProtoPoisoning, onConstructorPoisoning) {
  const parse = secureJson.parse
  const parseOptions = { protoAction: onProtoPoisoning, constructorAction: onConstructorPoisoning }
  return defaultJsonParser

  function defaultJsonParser (req, body, done) {
    const stackTraceLimit = Error.stackTraceLimit
    Error.stackTraceLimit = 0
    try {
      if (body === '' || body == null || (Buffer.isBuffer(body) && body.length === 0)) {
        return done(new FST_ERR_CTP_EMPTY_JSON_BODY(), undefined)
      }
      const json = parse(body, parseOptions)
      done(null, json)
    } catch {
      return done(new FST_ERR_CTP_INVALID_JSON_BODY(), undefined)
    } finally {
      Error.stackTraceLimit = stackTraceLimit
    }
  }
}
```
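A sketch of that narrower scoping (the `ErrorCtor` parameter stands in for Fastify's `FST_ERR_*` classes): wrap only the construction, so the limit is already restored by the time `done()` runs user code.

```javascript
// Sketch: scope the Error.stackTraceLimit change to the error construction
// only, restoring it before control ever returns to the caller.
// ErrorCtor is a stand-in for Fastify's FST_ERR_* error classes.
function createStackless (ErrorCtor) {
  const previousLimit = Error.stackTraceLimit
  Error.stackTraceLimit = 0
  try {
    return new ErrorCtor()
  } finally {
    Error.stackTraceLimit = previousLimit
  }
}

// Usage inside the parser would then look like:
//   return done(createStackless(FST_ERR_CTP_INVALID_JSON_BODY), undefined)
```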
I am wondering whether it would be better to do this with fastify-error and provide an option to hide the stack trace.
```js
// per-error
createError(code, message [, statusCode [, Base]] [, hideStackTrace])

// globally
const createError = require('@fastify/error')
createError.hideStackTrace = true
```

The global option is used to hide the stack trace for all Fastify errors in production.
The per-error option is used to hide or unhide the stack trace where we think it necessary.
The global option is needed because we always provide the stack trace detail by default.
It would be better to hide that detail in production mode.
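A hypothetical sketch of what such a per-error option could look like (note: `hideStackTrace` is only a proposal here, not part of the current `@fastify/error` API, and the factory below is a simplified stand-in for its `createError`):

```javascript
// Hypothetical sketch of the proposed per-error option. hideStackTrace is
// NOT part of @fastify/error today; this factory only illustrates the idea.
function createError (code, message, statusCode = 500, hideStackTrace = false) {
  return class FastifyError extends Error {
    constructor () {
      if (hideStackTrace) {
        // Disable stack capture only for this one construction.
        const previousLimit = Error.stackTraceLimit
        Error.stackTraceLimit = 0
        super(message)
        Error.stackTraceLimit = previousLimit
      } else {
        super(message)
      }
      this.code = code
      this.statusCode = statusCode
    }
  }
}

const QuietError = createError(
  'FST_ERR_CTP_INVALID_JSON_BODY',
  "Body is not valid JSON but content-type is set to 'application/json'",
  400,
  true
)
```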
LGTM!

How should we move on?
If @climba03003's suggestion has the same benefits, I'd say that his approach would be a good way to move forward.
@Uzlopak would you like to rebase and finish this up? I think we can just set captureStackTrace to false for all validation errors?
Signed-off-by: Aras Abbasi <[email protected]>
Sorry, but is this now acceptable? It has been such a long time since I touched this.
mcollina
left a comment
lgtm
Co-authored-by: Gürgün Dayıoğlu <[email protected]> Signed-off-by: Aras Abbasi <[email protected]>
Oh, OK. Should be ready now ;)
gurgunday
left a comment
lgtm
While working on fastify-compress I realized that our default content-type parser for JSON does not return Fastify errors.
See the code in fastify/fastify-compress#342
This also avoids a potential reflection attack.
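To illustrate the reflection concern (a standalone sketch; on recent Node.js versions `JSON.parse` error messages can echo a fragment of the attacker-controlled body): replacing the raw error with a fixed-message, Fastify-style error keeps body content out of the response.

```javascript
// Sketch: a raw JSON.parse error may echo part of the (attacker-controlled)
// body in its message on recent Node.js versions. Returning a fixed-message
// error instead avoids reflecting that content back to the client.
function safeParse (body) {
  try {
    return { value: JSON.parse(body) }
  } catch {
    // Fixed message: nothing from the request body leaks into the response.
    const err = new Error("Body is not valid JSON but content-type is set to 'application/json'")
    err.code = 'FST_ERR_CTP_INVALID_JSON_BODY'
    err.statusCode = 400
    return { error: err }
  }
}

const result = safeParse('{"hello":<script>alert(1)</script>}')
console.log(result.error.message)  // fixed message, no body content echoed
```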