When joining two large pandas DataFrames, my notebook kept taking forever to run the cell containing the join. It would do some processing, then CPU usage would drop off, but the cell still appeared to be running: [*]
The same code worked fine joining smaller DataFrames, and I resolved the problem by increasing the memory allocated to Docker. I believe the join was running out of memory, but there was no indication of this within JupyterLab.
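One way to catch this before the join hangs is to estimate the in-memory size of the operands first. A minimal sketch (the sizes and the 3x headroom factor are illustrative assumptions, not exact bounds, since a merge's peak memory depends on key cardinality and the join type):

```python
import numpy as np
import pandas as pd

def frame_gb(df: pd.DataFrame) -> float:
    """Approximate in-memory size of a DataFrame in GB."""
    return df.memory_usage(deep=True).sum() / 1e9

# Illustrative data; substitute your own DataFrames.
n = 1_000_000
left = pd.DataFrame({"key": np.arange(n), "x": np.random.rand(n)})
right = pd.DataFrame({"key": np.arange(n), "y": np.random.rand(n)})

print(f"left: {frame_gb(left):.2f} GB, right: {frame_gb(right):.2f} GB")

# Rough rule of thumb: budget several times the combined input size,
# since the merge materialises the result alongside its inputs.
budget_gb = 3 * (frame_gb(left) + frame_gb(right))
print(f"budget at least ~{budget_gb:.2f} GB for the merge")

merged = left.merge(right, on="key")
```

If the budget exceeds the memory allocated to the Docker container, the kernel can stall or be OOM-killed with no visible error in the notebook, matching the behaviour above.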
It might be useful if JupyterLab could display memory usage. This extension might be a possibility:
https://github.com/jtpio/jupyterlab-system-monitor