Simplify JSON_INT_MAX #102
Conversation
Seems alright to me, though I wonder if the …

I agree.

Indeed. Fixed by pull request #109.
Also added the contributor's name to the AUTHORS file.
Hm, I tried merging this, but it breaks the build due to mismatched sign comparisons: the original definition of the macro was signed, whereas the new macro is unsigned. Is it safe to just add a cast back to json_int_t?
Also, if you have a preference about your name being in the AUTHORS
file, now would be the time to say so.
```diff
-        : (sizeof(json_int_t) == 4
-           ? INT32_MAX
-           : INT64_MAX));
+#define JSON_INT_MAX ((1UL << ((sizeof(json_int_t) * CHAR_BIT) - 1)) - 1)
```
For a patch release such as 1.1.1, I would change as little code as possible:
```c
static const json_int_t JSON_INT_MAX =
    (json_int_t)((1UL << ((sizeof(json_int_t) * CHAR_BIT) - 1)) - 1);
```
Also, 1UL is an unsigned long. What if json_int_t is larger than long?
That's a very good point; perhaps the alternative you proposed in #84 would be better, then.
Either it has been defined explicitly, or we attempt to calculate it using the method reverted by fcfa748. See discussion in json-parser#84 and json-parser#102.
Fixes json-parser#84
Closes json-parser#102
Probably affects json-parser#115

In theory this should be guaranteed to work regardless of the underlying representation: https://stackoverflow.com/a/2711560/1959975

Demos for C++17 and C89: https://gcc.godbolt.org/#z:OYLghAFBqd5QCxAYwPYBMCmBRdBLAF1QCcAaPECAMzwBtMA7AQwFtMQByARg9KtQYEAysib0QXACx8BBAKoBnTAAUAHpwAMvAFYTStJg1DIApACYAQuYukl9ZATwDKjdAGFUtAK4sGe1wAyeAyYAHI%2BAEaYxBIAHKQADqgKhE4MHt6%2BekkpjgJBIeEsUTFc8XaYDmlCBEzEBBk%2Bfly2mPZ5DDV1BAVhkdFxtrX1jVktCsM9wX3FA2UAlLaoXsTI7BzmAMxYNCEA1ABq2ABKAJIAYgCaAPpuAIIBbhDaCgLXwQTXBKR7DPN7LzeHy%2Bvy2W2uAFk7gANCD/EwAdgse2ImAIKwYe2erwY70EX3mUC8DBSwBC6ABOLxnwI8wAtFw9gB6LHE0nkylA/G0szzEybZGIgAiJg0d1F4rMmzwVB2e2u12QCW8CmVXgUEq2wWQ3iwe35bloeBYhA1m2wmqlO2mhxOF0u2K5NJ%2Bf31AFY3BK9t7ObjgQRQVLwSwmKo4fqkSi0Ri9hN0CAQAwfNE8MhrkaTQQzW5AX7ufzsAmQ2G%2BQKI0L3Z6xT7bWcrrcHk9c9Svi6%2BWKtm0lJbtphdpha/bB/X7o8e64ZZqxUc6w7kAg6j953VS1Zp3arhAFAgSN9Yzv6quJTP7RAPj8Pkf17OILQBMAfnejFe7ifN0/gHsP4/7x%2BX2%2BHQVD4uDMeI9mA0CXw4RZaE4N1eD8DgtFIVBOE9SxrFjZZVgHLYeFIAhNGgxYAGsQE2AA2AA6LhaK4CiNA0Lg3ViN1NjKfROEkBCiJQzheAUEANAIojFjgWAkDQFgEjoaJyEoKSZPoGJgFosw%2BDoAhokEiAIl4iJgjqABPTh8KkthBAAeQYWgTKQ3gsBDIxxHs0h8FRKoADdMEE1zMFUSovC00zeA%2BNpeKNCJiGMjwsF4ghiGNELFioAxgAUA48EwAB3SyEkYEKZEEEQxHYKQivkJQ1F43QWgMIwUGsax9DwCJBMgRZUASDpfLpON%2BSFMkvCaywuARPY6UszYJoAdTEWhZv8hKmEW4hiBIAS2kqDoXAYdxPCafw9t6IoSmyZJUgEUZmkSC6OhO/pSladpqkma69AqKoBC6eoHtmJ6Jm6d7xkmP6zq4RZXhWNY9ASzB1h4GC4J41zUI4VRYgoukKMkPZgGQZA9loqizCxdCrEsH5cEIEh9SlFo9g8aTZOIOn2PmXhCPs%2BZSPIjQqLdQW3Q0CiKOF2IuE2YWKM4jhuNIFgJEY0hEOQtGBKEkTudIcTEBQVBmeU%2BSIEUlmQFUkCNNoLTiB0vTXIM5hiDssyDYsghrNs3jHMMYAXOQ9ztrwbzfOQ/zAuC7hQsEcLXMi6Lndi9ZkISpKo5StKMqy3L8sQ/D%2BGK0RxHKgvKpUdRXN0dT6uMEabEi9q4RQ7q0l6/rNkGpNYgATgmqbZvmxbVGW1b1tZulVD2ZBNpe5wIFcYHSECaZTrmW7cjSReckuhgwbXz6Oh%2BhoDrGZ6g%2B%2B0GV8ej63pPm7Ad%2Bq//okSHsJhlo4YR6DZfglXeLRjGWMcZ4wJkTLgJMyZ1ypvgIgrM8I/CZkpaIbNeSc1ErzTYmwqKYJwbgvBst5aq14OrWwmsuZaB5rLMwKM1b8S1hQxY3lbZpBAJIIAA%3D%3D
Thank you for this; however, after further research and consideration I believe #115 offers a better solution, so I'm closing this for now.
Simplifies the JSON_INT_MAX expression.
Thanks to @LB-- for suggesting the use of CHAR_BIT instead of a hard-coded 8 (CHAR_BIT is defined to be at least 8).