
GenesisBuilder: arbitrary_precision feature enabled for serde_json #2987

Merged
merged 1 commit into from
Jan 18, 2024

Conversation

michalkucharczyk
Contributor

@michalkucharczyk michalkucharczyk commented Jan 18, 2024

The arbitrary_precision feature allows (de-)serializing big numbers without error.
For some details, refer also to #1256 (comment)

fixes: #2963
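For context, enabling this is a one-line dependency change; a minimal sketch (the exact version pin used in the actual PR may differ):

```toml
[dependencies]
serde_json = { version = "1.0", features = ["arbitrary_precision"] }
```

With the feature on, serde_json stores numbers as strings internally, so values outside the u64/i64/f64 ranges round-trip without loss.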

@michalkucharczyk michalkucharczyk requested a review from a team January 18, 2024 14:16
@michalkucharczyk michalkucharczyk added the R0-silent Changes should not be mentioned in any release notes label Jan 18, 2024
Contributor

@koute koute left a comment


Hmm.... as an alternative maybe we should consider serializing those numbers as strings instead, if possible?

The reason being that this is not really portable: anyone who tries to load a JSON file containing such numbers in any other language, using any other library, or just using serde_json without this flag will silently get incorrect results (if the library doesn't explicitly produce an error in such a case, and I think almost all of them don't?).
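The silent corruption described here can be sketched in plain Rust, under the assumption that the consuming parser stores JSON numbers as f64 (as, for example, JavaScript's JSON.parse does):

```rust
fn main() {
    // A 64-bit integer near the top of the range.
    let original: u64 = u64::MAX; // 18446744073709551615

    // A parser that stores numbers as f64 rounds this to the nearest
    // representable double (2^64) and reports no error.
    let parsed = original as f64;

    // Converting back shows the value has silently changed.
    assert_ne!(parsed as u128, original as u128);
    assert_eq!(parsed as u128, 1u128 << 64);
}
```

No error is ever raised along the way, which is exactly the footgun: the loss is invisible unless the consumer compares against the original value.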

@davxy
Member

davxy commented Jan 18, 2024

The JSON spec doesn't pose any limit on the number of bits for the number type (https://www.json.org/json-en.html).

As u128 is native to Rust, it doesn't really need bignum support. I propose to:

  1. accept this
  2. (optional) propose a serde_json patch to extend numbers to u128.

However, I also understand @koute's concerns, i.e. popular libs and other language implementations typically support up to u64.

@michalkucharczyk
Contributor Author

michalkucharczyk commented Jan 18, 2024

Hmm.... as an alternative maybe we should consider serializing those numbers as strings instead, if possible?

The reason being that this is not really portable: anyone who tries to load a JSON file containing such numbers in any other language, using any other library, or just using serde_json without this flag will silently get incorrect results (if the library doesn't explicitly produce an error in such a case, and I think almost all of them don't?).

Would that mean that 64-bit integers would also need to be serialized as strings? (https://www.rfc-editor.org/rfc/rfc8259#section-6):

Note that when such software is used, numbers that are integers and
are in the range [-(2^53)+1, (2^53)-1] are interoperable in the
sense that implementations will agree exactly on their numeric
values.

This for sure needs to be solved at some point. Strings would be possible, but that would require touching the affected pallets.
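The RFC 8259 interoperability range quoted above can be checked directly in Rust; a small sketch:

```rust
fn main() {
    // (2^53) - 1 is the largest integer in the RFC 8259 "interoperable"
    // range: it survives a round-trip through an IEEE 754 double exactly.
    let interoperable: u64 = (1u64 << 53) - 1;
    assert_eq!(interoperable as f64 as u64, interoperable);

    // One step past the range, (2^53) + 1 rounds to 2^53, so the
    // round-trip is no longer exact.
    let beyond: u64 = (1u64 << 53) + 1;
    assert_ne!(beyond as f64 as u64, beyond);
}
```

This is why the RFC stops at 2^53 - 1 rather than u64::MAX: beyond that point, implementations that back numbers with doubles can no longer agree exactly on integer values.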

@michalkucharczyk
Contributor Author

The JSON spec doesn't pose any limit on the number of bits for the number type (https://www.json.org/json-en.html).

As u128 is native to Rust, it doesn't really need bignum support. I propose to:

  1. accept this
  2. (optional) propose a serde_json patch to extend numbers to u128.

However, I also understand @koute's concerns, i.e. popular libs and other language implementations typically support up to u64.

See discussion here: serde-rs/json#846

From what I see, serde_json actually does not support u128 :), which is why we need arbitrary precision.
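To illustrate why u64-capped number support is not enough here: chain balances are u128, and realistic values overflow u64. A minimal sketch with a hypothetical balance figure:

```rust
fn main() {
    // Hypothetical total-issuance figure: one billion tokens with
    // 18 decimal places, i.e. 10^27 base units.
    let balance: u128 = 1_000_000_000 * 10u128.pow(18);

    // u64::MAX is about 1.8 * 10^19, so this balance cannot be
    // represented by a serializer limited to 64-bit integers.
    assert!(balance > u64::MAX as u128);
}
```

Any such value forces either arbitrary-precision numbers (this PR) or string encoding (koute's alternative) in the chain-spec JSON.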

@koute
Contributor

koute commented Jan 18, 2024

Would that mean that 64-bit integers would also need to be serialized as strings?

For perfect compatibility with everything, yes. (In languages where numbers are f64 there are only 52 mantissa bits, so you can't encode full 64-bit ints with full precision.)

It is an open question whether we care, but it is a footgun that should be at the very least documented.

@bkchr bkchr added this pull request to the merge queue Jan 18, 2024
@bkchr
Member

bkchr commented Jan 18, 2024

It is an open question whether we care, but it is a footgun that should be at the very least documented.

You mean that other languages would fail to parse such a number?

@michalkucharczyk
Contributor Author

It is an open question whether we care, but it is a footgun that should be at the very least documented.

You mean that other languages would fail to parse such a number?

It's a matter of the JSON lib implementation. Some will (Ruby, Python), some will not (Node.js). From what I found, the C++ JSON lib nlohmann/json is not able to parse it.

Merged via the queue into master with commit 87927bb Jan 18, 2024
128 of 131 checks passed
@bkchr bkchr deleted the mku-serde-json-arbitrary-precision-enabled branch January 18, 2024 22:10
bgallois pushed a commit to duniter/duniter-polkadot-sdk that referenced this pull request Mar 25, 2024
Development

Successfully merging this pull request may close these issues.

Impossible to create chain spec with large balance
4 participants