I modified a JSON export with another tool (jq) and was unable to read it back into flowpipeline.

In the first attempt, jq pretty-printed the JSON with newlines between each JSON field (so a single line no longer contains a full flowpipeline object), which led to a failure decoding the JSON.

In the second attempt I instructed jq to "uglify" the JSON, which removes all newlines. But then the import fails because the file consists of one very long line, leading to a "buffer too big" error.

Both errors are caused by the input segment using a scanner that always processes the data line by line. That approach only works for a subset of valid JSON files.
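For reference, Go's `encoding/json.Decoder` tokenizes the stream itself instead of relying on line boundaries, so it handles pretty-printed, compacted, and newline-delimited JSON alike. Below is a minimal sketch of the kind of fix I have in mind; the generic map stands in for the real flow message type, and none of the names are taken from flowpipeline's actual code:

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"os"
)

func main() {
	// json.Decoder reads JSON values directly from the stream,
	// regardless of how (or whether) they are separated by newlines.
	dec := json.NewDecoder(os.Stdin)
	for {
		// Illustrative placeholder for the actual flow message struct.
		var msg map[string]any
		if err := dec.Decode(&msg); err == io.EOF {
			break // clean end of input
		} else if err != nil {
			fmt.Fprintln(os.Stderr, "decode error:", err)
			os.Exit(1)
		}
		fmt.Printf("decoded object with %d fields\n", len(msg))
	}
}
```

This reads both the pretty-printed and the compacted jq output, and it sidesteps the fixed scanner buffer entirely, since the decoder never needs to hold a whole line in memory at once.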