
Conversation

Uzlopak (Contributor) commented Jan 7, 2025

While working on fastify-compress, I realized that our default content type parser for JSON does not return Fastify errors.

See the code in fastify/fastify-compress#342

This also avoids a potential reflection attack.

@github-actions github-actions bot added documentation Improvements or additions to documentation typescript TypeScript related labels Jan 7, 2025
Uzlopak (Contributor, Author) commented Jan 7, 2025

How about avoiding the creation of stack traces, like this?

  function defaultJsonParser (req, body, done) {
    const stackTraceLimit = Error.stackTraceLimit
    Error.stackTraceLimit = 0
    try {
      if (body === '' || body == null || (Buffer.isBuffer(body) && body.length === 0)) {
        return done(new FST_ERR_CTP_EMPTY_JSON_BODY(), undefined)
      }
      const json = parse(body, parserOptions)
      done(null, json)
    } catch {
      return done(new FST_ERR_CTP_INVALID_JSON_BODY(), undefined)
    } finally {
      Error.stackTraceLimit = stackTraceLimit
    }
  }
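To see why this helps: with `Error.stackTraceLimit` set to `0`, V8 skips collecting stack frames entirely, so the constructed error carries no stack and is much cheaper to create. A minimal standalone sketch (not Fastify code) of the trick used above:

```javascript
'use strict'

// Standalone sketch (not Fastify code): with Error.stackTraceLimit set to 0,
// V8 skips frame collection, so the constructed error has no stack frames.
function makeErrorWithoutStack (message) {
  const previousLimit = Error.stackTraceLimit
  Error.stackTraceLimit = 0
  try {
    return new Error(message)
  } finally {
    // Restore the limit so errors created elsewhere keep their stack traces.
    Error.stackTraceLimit = previousLimit
  }
}

const err = makeErrorWithoutStack('invalid JSON body')
console.log(err.stack) // header line only, no "at ..." frames
```

The `try`/`finally` mirrors the parser above: the limit is restored even if error construction itself throws.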

mcollina (Member) left a comment:

lgtm

Uzlopak (Contributor, Author) commented Jan 8, 2025

@mcollina
@metcoder95
Just for completeness: this could slow down Fastify's response time on invalid JSON payloads by about 90%! I would recommend considering my proposal above about avoiding the stack traces.
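For intuition on where that cost comes from, the per-error overhead of stack capture can be measured in isolation. This is an illustrative sketch, not the autocannon numbers below; results will vary by machine and V8 version:

```javascript
'use strict'

// Rough micro-benchmark sketch: compares constructing errors with and without
// stack capture. Numbers are illustrative, not the PR's benchmark results.
function bench (label, fn, iterations = 1e5) {
  const start = process.hrtime.bigint()
  for (let i = 0; i < iterations; i++) fn()
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6
  console.log(`${label}: ${elapsedMs.toFixed(1)} ms for ${iterations} iterations`)
  return elapsedMs
}

bench('with stack capture', () => new Error('invalid JSON body'))

bench('without stack capture', () => {
  const limit = Error.stackTraceLimit
  Error.stackTraceLimit = 0
  const err = new Error('invalid JSON body')
  Error.stackTraceLimit = limit
  return err
})
```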

metcoder95 (Member) commented:

> Just for completeness: this could slow down Fastify's response time on invalid JSON payloads by about 90%! I would recommend considering my proposal above about avoiding the stack traces.

What's the performance difference with the stack trace limitation?

Uzlopak (Contributor, Author) commented Jan 8, 2025

Benchmark case:

'use strict'

const fastify = require('../../fastify')({
  logger: false
})

fastify
  .post('/', function (req, reply) {
    reply
      .send({ hello: req.body.hello })
  })

fastify.listen({ port: 3000 }, (err, address) => {
  if (err) throw err
})

Sending valid JSON:

autocannon -m POST -H content-type=application/json -b '{"hello":"world"}' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections


┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 0 ms  │ 1 ms │ 0.02 ms │ 0.21 ms │ 30 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬─────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg     │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Req/Sec   │ 14,583  │ 14,583  │ 23,055  │ 23,535  │ 22,300  │ 2,465.92 │ 14,583  │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Bytes/Sec │ 2.74 MB │ 2.74 MB │ 4.34 MB │ 4.42 MB │ 4.19 MB │ 463 kB   │ 2.74 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴─────────┴──────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11

245k requests in 11.01s, 46.1 MB read

Sending invalid JSON on the main branch:

aras@aras-HP-ZBook-15-G3:~/workspace/fastify$ autocannon -m POST -H content-type=application/json -b '{"hello":"world"' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections


┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 1 ms  │ 1 ms │ 0.09 ms │ 0.34 ms │ 18 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬─────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg     │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Req/Sec   │ 11,007  │ 11,007  │ 17,039  │ 17,279  │ 16,420  │ 1,730.68 │ 11,002  │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┤
│ Bytes/Sec │ 3.48 MB │ 3.48 MB │ 5.39 MB │ 5.46 MB │ 5.19 MB │ 547 kB   │ 3.48 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴─────────┴──────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11

0 2xx responses, 180634 non 2xx responses
181k requests in 11.01s, 57.1 MB read

This branch:

aras@aras-HP-ZBook-15-G3:~/workspace/fastify$ autocannon -m POST -H content-type=application/json -b '{"hello":"world"' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections


┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 1 ms  │ 1 ms │ 0.18 ms │ 0.47 ms │ 30 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬───────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg       │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Req/Sec   │ 8,111   │ 8,111   │ 13,423  │ 14,031  │ 12,879.82 │ 1,588.2 │ 8,110   │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Bytes/Sec │ 2.77 MB │ 2.77 MB │ 4.58 MB │ 4.78 MB │ 4.39 MB   │ 541 kB  │ 2.77 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴───────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11

0 2xx responses, 141664 non 2xx responses
142k requests in 11.01s, 48.3 MB read

With my suggested change:

function getDefaultJsonParser (onProtoPoisoning, onConstructorPoisoning) {
  const parse = secureJson.parse
  const parseOptions = { protoAction: onProtoPoisoning, constructorAction: onConstructorPoisoning }

  return defaultJsonParser

  function defaultJsonParser (req, body, done) {
    const stackTraceLimit = Error.stackTraceLimit
    Error.stackTraceLimit = 0
    try {
      if (body === '' || body == null || (Buffer.isBuffer(body) && body.length === 0)) {
        return done(new FST_ERR_CTP_EMPTY_JSON_BODY(), undefined)
      }
      const json = parse(body, parseOptions)
      done(null, json)
    } catch {
      return done(new FST_ERR_CTP_INVALID_JSON_BODY(), undefined)
    } finally {
      Error.stackTraceLimit = stackTraceLimit
    }
  }
}
aras@aras-HP-ZBook-15-G3:~/workspace/fastify$ autocannon -m POST -H content-type=application/json -b '{"hello":"world"' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections


┌─────────┬──────┬──────┬───────┬──────┬────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg    │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 1 ms  │ 1 ms │ 0.1 ms │ 0.34 ms │ 20 ms │
└─────────┴──────┴──────┴───────┴──────┴────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬──────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg      │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Req/Sec   │ 10,951  │ 10,951  │ 17,039  │ 17,375  │ 16,289.1 │ 1,737.52 │ 10,944  │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Bytes/Sec │ 3.73 MB │ 3.73 MB │ 5.81 MB │ 5.92 MB │ 5.55 MB  │ 592 kB   │ 3.73 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴──────────┴──────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11

0 2xx responses, 179183 non 2xx responses
179k requests in 11.01s, 61.1 MB read

Alternative:

const stackTraceLimit = Error.stackTraceLimit
Error.stackTraceLimit = 0
const defaultInvalidJSONError = new FST_ERR_CTP_INVALID_JSON_BODY()
Error.stackTraceLimit = stackTraceLimit

function getDefaultJsonParser (onProtoPoisoning, onConstructorPoisoning) {
  const parse = secureJson.parse
  const parseOptions = { protoAction: onProtoPoisoning, constructorAction: onConstructorPoisoning }

  return defaultJsonParser

  function defaultJsonParser (req, body, done) {
    if (body === '' || body == null || (Buffer.isBuffer(body) && body.length === 0)) {
      return done(new FST_ERR_CTP_EMPTY_JSON_BODY(), undefined)
    }
    let json
    try {
      json = parse(body, parseOptions)
    } catch (err) {
      err.code = 'FST_ERR_CTP_INVALID_JSON_BODY'
      err.statusCode = 400
      err.message = "Body is not valid JSON but content-type is set to 'application/json'"
      Object.setPrototypeOf(err, defaultInvalidJSONError)
      return done(err, undefined)
    }
    done(null, json)
  }
}
aras@aras-HP-ZBook-15-G3:~/workspace/fastify$ autocannon -m POST -H content-type=application/json -b '{"hello":"world"' 127.0.0.1:3000
Running 10s test @ http://127.0.0.1:3000
10 connections


┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 1 ms  │ 1 ms │ 0.11 ms │ 0.36 ms │ 20 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬───────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg       │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Req/Sec   │ 9,639   │ 9,639   │ 16,607  │ 17,055  │ 16,014.91 │ 2,031.5 │ 9,633   │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Bytes/Sec │ 3.28 MB │ 3.28 MB │ 5.66 MB │ 5.82 MB │ 5.46 MB   │ 694 kB  │ 3.28 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴───────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11

0 2xx responses, 176182 non 2xx responses
176k requests in 11.01s, 60.1 MB read

gurgunday (Member) left a comment:

Please don't merge before tonight; I'd like to take a look as well.

Uzlopak (Contributor, Author) commented Jan 8, 2025

Yeah, don't worry.

metcoder95 (Member) commented:

> With my suggested change:

I'm OK with the suggestion; secure-json-parse seems to already remove the stack trace upon parsing, so the regression seems to be mostly due to error generation.

I'd say this is possibly not a hot path per se, so the regression from your stack trace suggestion should be fine.

gurgunday (Member) left a comment:

Really interesting alternatives! Even though it adds more code, I think we should mirror secure-json-parse and not generate stack traces for the validation errors. But this version might be problematic: during done, potentially while user code is running, stack traces will still be disabled, I believe? We'd need to turn them off just before the error creation and turn them back on right after:

function getDefaultJsonParser (onProtoPoisoning, onConstructorPoisoning) {
  const parse = secureJson.parse
  const parseOptions = { protoAction: onProtoPoisoning, constructorAction: onConstructorPoisoning }

  return defaultJsonParser

  function defaultJsonParser (req, body, done) {
    const stackTraceLimit = Error.stackTraceLimit
    Error.stackTraceLimit = 0
    try {
      if (body === '' || body == null || (Buffer.isBuffer(body) && body.length === 0)) {
        return done(new FST_ERR_CTP_EMPTY_JSON_BODY(), undefined)
      }
      const json = parse(body, parseOptions)
      done(null, json)
    } catch {
      return done(new FST_ERR_CTP_INVALID_JSON_BODY(), undefined)
    } finally {
      Error.stackTraceLimit = stackTraceLimit
    }
  }
}

climba03003 (Member) left a comment:

I am wondering whether it would be better to do this in @fastify/error and provide an option to hide the stack trace.

// per-error
createError(code, message [, statusCode [, Base]] [, hideStackTrace])

// globally
const createError = require('@fastify/error')
createError.hideStackTrace = true

The global option is used to hide the stack trace for all Fastify errors in production.
The per-error option is used to hide or unhide the stack trace where we think it necessary.

The global option is needed because we always provide the full stack trace by default;
it would be better to hide that detail in production mode.
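A sketch of how such an option might look. This is hypothetical, not the actual @fastify/error implementation; the factory shape and the `hideStackTrace` flag here are illustrative only:

```javascript
'use strict'

// Hypothetical sketch of the proposed option, not the real @fastify/error code.
// Errors skip stack capture when hideStackTrace is set per error class, or when
// the global createError.hideStackTrace flag is enabled.
function createError (code, message, statusCode = 500, hideStackTrace = false) {
  function FastifyError () {
    this.code = code
    this.name = 'FastifyError'
    this.message = message
    this.statusCode = statusCode
    if (!hideStackTrace && !createError.hideStackTrace) {
      // Only pay for frame collection when stack traces are wanted.
      Error.captureStackTrace(this, FastifyError)
    }
  }
  Object.setPrototypeOf(FastifyError.prototype, Error.prototype)
  return FastifyError
}
createError.hideStackTrace = false

const QuietError = createError('FST_EXAMPLE', 'example failure', 400, true)
console.log(new QuietError().stack) // undefined: no stack was captured
```

Checking the global flag at construction time (rather than at class-creation time) lets an application flip it once, e.g. when `NODE_ENV === 'production'`, and affect all error classes created earlier.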

gurgunday (Member) commented:

> I am wondering whether it would be better to do this in @fastify/error and provide an option to hide the stack trace.
>
>     // per-error
>     createError(code, message [, statusCode [, Base]] [, hideStackTrace])
>
>     // globally
>     const createError = require('@fastify/error')
>     createError.hideStackTrace = true
>
> The global option is used to hide the stack trace for all Fastify errors in production. The per-error option is used to hide or unhide the stack trace where we think it necessary.
>
> The global option is needed because we always provide the full stack trace by default; it would be better to hide that detail in production mode.

LGTM!

Uzlopak (Contributor, Author) commented Jan 9, 2025

How should we move on?

metcoder95 (Member) commented:

If @climba03003's suggestion has the same benefits, I'd say his approach is a good way to move forward.

@Eomm Eomm added the bugfix Issue or PR that should land as semver patch label Jan 18, 2025
gurgunday (Member) commented:

@Uzlopak would you like to rebase and finish this up?

I think we can just set captureStackTrace to false for all validation errors?

Uzlopak (Contributor, Author) commented Jun 4, 2025

@gurgunday

Sorry, but is this now acceptable? It has been such a long time since I touched this.

mcollina (Member) left a comment:

lgtm

Co-authored-by: Gürgün Dayıoğlu <[email protected]>
Signed-off-by: Aras Abbasi <[email protected]>
@Uzlopak Uzlopak requested a review from gurgunday June 29, 2025 12:28
Uzlopak (Contributor, Author) commented Jun 29, 2025

Oh, ok. Should be ready now ;)

gurgunday (Member) left a comment:

lgtm

@gurgunday gurgunday merged commit c277b9f into main Jul 20, 2025
36 checks passed
@gurgunday gurgunday deleted the invalid-json-body branch July 20, 2025 17:36

Labels

bugfix Issue or PR that should land as semver patch documentation Improvements or additions to documentation typescript TypeScript related


7 participants