Memory consumed for a larger dataset is not freed post vegalite_to_png call #143
Comments
Attachment: vl-convert-clip.mov (the video is clipped in between to fit GitHub's 10MB limit).
Hi @manurampandit, thanks for the report. What version of vl-convert-python do you have installed?
I had tried this on
Yeah, try the latest. There was a memory leak fix that went into 0.14.0, so that's not the issue. We have updated the JavaScript runtime since 0.14.0, so there's a chance that will help.
I tried this with the latest
I think the reason this example is so slow to render is that the x-axis is a string with type nominal, so we end up needing to draw a tick label for every row of the dataset. Was this your intention? If not, it might be helpful to create an example with a large dataset that doesn't need to render so much text.
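For illustration, here is a hypothetical encoding change of the kind described, expressed as Python dicts (the field name `ts` is made up and not from the issue):

```python
# Hypothetical Vega-Lite x encodings, expressed as Python dicts.
# "nominal" treats every distinct string as its own axis category,
# so a large dataset forces one tick label per row.
slow = {"x": {"field": "ts", "type": "nominal"}}

# Parsing the field as a date lets Vega-Lite compute a compact axis
# with only a handful of ticks, regardless of row count.
fast = {"x": {"field": "ts", "type": "temporal"}}
```

Whether this applies depends on what the original dataset's x field actually contains.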
I understand the reason for the slowness (data not in the right format), which I expect some users may run into accidentally. My primary intention is to get rid of the memory leak. As I mentioned earlier, there is a workaround for now: invoking the conversion in a separate process. I tried looking into the GIL, but I don't have much experience with that. I would love to contribute back but need some pointers on where to look.
I did play with this a bit and wasn't able to track down where the memory is being held. What's notable is that, from my testing, this isn't a leak, as repeated usage doesn't keep increasing the memory consumption; the memory just isn't released after each usage completes. This may suggest there's a cache somewhere, or that the V8 JavaScript runtime isn't performing garbage collection after each usage. I think a next step would be to see if we can reproduce this using the
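A minimal sketch of how one might distinguish growth from a plateau across repeated calls, using a stand-in workload (in practice the callable would be the vl-convert export; nothing here is from the issue's actual test code):

```python
import gc
import resource

def peak_rss() -> int:
    # ru_maxrss is the peak RSS of this process (kilobytes on Linux,
    # bytes on macOS). It can reveal growth across calls, but it never
    # decreases, so it cannot show memory being released.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

def measure(fn, repeats: int = 5) -> list:
    """Call fn repeatedly and record peak RSS after each call."""
    readings = []
    for _ in range(repeats):
        fn()
        gc.collect()
        readings.append(peak_rss())
    return readings

# Stand-in workload: allocate and drop ~50MB per call. In practice fn
# would wrap a vl_convert.vegalite_to_png call on the problem spec.
readings = measure(lambda: bytearray(50_000_000))
# A plateau (readings stop growing after the first call) matches the
# "held but not leaked" behavior described above; steady growth would
# indicate a true per-call leak.
```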
I agree with the observed behavior: the memory consumed remains almost constant (after the first call). I'll try to isolate the issue by calling directly with
Playing with this a little more, I think the cause may be that the memory allocator the Rust code uses is not releasing the memory back to the OS. What operating system are you using?
On macOS, I'm seeing that a lot of memory is released around 3 minutes after the increase due to the image export. Could you check if you see this behavior as well? I'm not sure if there's anything we can do to speed this up.
Issue
During the process of converting a Vega-Lite spec to a PNG image via vegalite_to_png, a memory leak was observed.

Issue description
The overall memory inside the container increased and then settled around that level, i.e. it did not reduce. I have tried an explicit gc.collect() for the variables used, but it did not help much.
As this behavior is specific to a particular kind of data, I am including a script to randomly generate similar data.
The steps to reproduce mention the two steps required. I will see if I can link the video recording showing the overall memory footprint increase.
My guess is that the GIL is not freeing up the memory held during execution (or there is some leakage there).
Steps to reproduce:
1. Generate the data with the script above.
2. Call vegalite_to_png on the resulting spec.
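The original reproduction script is not shown in this thread; the following is a hedged stand-in with the same shape (string x field, many rows). The field names, row count, and timestamp format are assumptions, not the reporter's actual data:

```python
import json
import random

# Step 1: generate a large dataset whose x field is a string, so that
# Vega-Lite treats it as nominal (one axis category per distinct value).
rows = [
    {"ts": f"2023-01-01T{i % 24:02d}:{i % 60:02d}:{i % 60:02d}",
     "value": random.random()}
    for i in range(50_000)
]

spec = {
    "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
    "data": {"values": rows},
    "mark": "line",
    "encoding": {
        "x": {"field": "ts", "type": "nominal"},
        "y": {"field": "value", "type": "quantitative"},
    },
}

# Step 2 (requires vl-convert-python to be installed):
# import vl_convert
# png_bytes = vl_convert.vegalite_to_png(json.dumps(spec))
```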
Temporary hack to get rid of this:
This code overcomes the memory problem by running the conversion in a separate process, wherein the overall memory cleanup happens when that process exits, while in the default case the memory remains held in the main process.
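The workaround code itself is not reproduced in this thread; the following is a generic sketch of the approach described, run the work in a short-lived child process so the OS reclaims all of its memory on exit. The helper names are made up:

```python
import multiprocessing as mp

# Uses the "fork" start method (Linux/macOS). On Windows, switch to
# "spawn" and guard the call site with `if __name__ == "__main__":`.
_ctx = mp.get_context("fork")

def _worker(fn, args, queue):
    # Runs in the child process; everything it allocates is returned
    # to the OS when the process exits.
    queue.put(fn(*args))

def run_isolated(fn, *args):
    """Call fn(*args) in a child process and return its result."""
    queue = _ctx.Queue()
    proc = _ctx.Process(target=_worker, args=(fn, args, queue))
    proc.start()
    result = queue.get()  # read before join() to avoid a pipe-buffer deadlock
    proc.join()
    return result

# Hypothetical usage with the function from the issue:
# png_bytes = run_isolated(vl_convert.vegalite_to_png, spec_json)
```

The trade-off is per-call process startup and result-pickling overhead, which may matter if exports are frequent.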