Encoding runs in worst-case quadratic time #45
Comments
Perhaps you could make a PR implementing that solution :)

If you want a fast implementation: https://github.com/Vurv78/qjson.lua. It runs thousands of times faster than this repo.

Refer to this comment by me. TL;DR: your JSON parser is broken and has a time-complexity issue as well. Your performance claims in their current form can't be taken seriously.

Both of these have been addressed. Thanks for bringing it to my attention (also see my other comment).
Consider the following benchmark, which encodes the deeply nested structure

```
[1000, [999, [998, [...]]]]
```

This should run in a fraction of a second. Instead, it takes ~2 seconds on my device. This is because string concatenations / recursive `table.concat` calls are used internally to encode the JSON. The loop at the outermost level has to add the `[` and `]` brackets in time linear in the length of `table.concat(res, ",")`, and the string produced one level deeper is only marginally shorter: each level differs in length from the next by only a constant. Thus a linear-time operation is performed a linear number of times, resulting in worst-case quadratic encoding runtime in the depth of the nested structure.

The proper fix is to use a shared rope and have all recursive calls write to the same rope. You then concatenate strings only once at the end, so building the string runs in linear time. You can find an example of this approach in my JSON implementation.
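A minimal sketch of the rope approach described above (function names are hypothetical, not taken from this repo): every recursive call appends fragments to one shared table, and `table.concat` is invoked exactly once at the end, so the total work is linear in the output length.

```lua
-- Append the encoding of `value` to the shared rope (a plain Lua table).
-- Only arrays and numbers are handled here; a real encoder would also
-- cover strings, objects, booleans, and nil.
local function encode_value(rope, value)
  if type(value) == "table" then
    rope[#rope + 1] = "["
    for i, item in ipairs(value) do
      if i > 1 then rope[#rope + 1] = "," end
      encode_value(rope, item)
    end
    rope[#rope + 1] = "]"
  else
    rope[#rope + 1] = tostring(value)
  end
end

local function encode(value)
  local rope = {}
  encode_value(rope, value)
  -- One single concatenation at the very end: linear time overall.
  return table.concat(rope)
end

-- Build the deeply nested benchmark input [1000, [999, [998, ...]]]
local nested = {1}
for i = 2, 1000 do
  nested = {i, nested}
end
print(#encode(nested))
```

With this structure the brackets of each nesting level are written in constant time, instead of re-copying the ever-growing inner string at every level.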