generate_gt.py memory issue #129

Open
ywm5 opened this issue Mar 23, 2023 · 2 comments

ywm5 commented Mar 23, 2023

```
ray.exceptions.RayTaskError(RayOutOfMemoryError): ray::process_with_single_worker() (pid=17640, ip=127.0.0.1)
  File "python\ray\_raylet.pyx", line 623, in ray._raylet.execute_task
  File "E:\Anaconda\envs\NeuralReconpython39\lib\site-packages\ray\_private\memory_monitor.py", line 162, in raise_if_low_memory
    raise RayOutOfMemoryError(
ray._private.memory_monitor.RayOutOfMemoryError: More than 95% of the memory on node yanwm5 is used (15.08 / 15.73 GB). The top 10 memory consumers are:
```

"I want to know how much memory is generate_gt.py required to run, but I don't have enough 16GB of memory."

royrx93 commented Apr 13, 2023

I set n_proc, num_workers, and n_gpu to 1, and that seems to solve the OOM problem on my device.
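For reference, here is a minimal sketch of the same idea at the ray level; this is not NeuralRecon's actual code, and the task body is only a placeholder. Capping ray at a single CPU serializes the process_with_single_worker tasks named in the traceback, so peak memory stays at roughly one task's working set:

```python
import ray

# Cap ray at a single CPU so only one task runs at a time (a sketch;
# generate_gt.py wires this up through its own n_proc/n_gpu flags).
ray.init(num_cpus=1)

@ray.remote(num_cpus=1)
def process_with_single_worker(chunk):
    # placeholder for the per-fragment TSDF fusion work in generate_gt.py
    return sum(chunk)

# With num_cpus=1, these tasks queue and run one after another instead of
# in parallel, so only one working set is resident at a time.
results = ray.get([process_with_single_worker.remote([1] * 1000) for _ in range(4)])
print(results)
ray.shutdown()
```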

bloodhunt3r commented May 29, 2023

It looks like ray has a memory leak issue, and so far there is no fix:

ray-project/ray#25779

You need to wait for the ray 2.5.0 release.
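Until then, one mitigation sometimes suggested for leaky ray workers is max_calls, which makes ray recycle the worker process after a fixed number of tasks so leaked memory cannot accumulate across tasks. A minimal sketch; fuse_fragment is a hypothetical stand-in, not a function from generate_gt.py:

```python
import ray

ray.init(num_cpus=1)

# max_calls=1 tells ray to kill and restart the worker process after each
# task, returning any memory the task leaked back to the OS.
@ray.remote(max_calls=1)
def fuse_fragment(i):
    buf = [0] * 10_000_000  # stand-in for a task that holds on to memory
    return i

print(ray.get([fuse_fragment.remote(i) for i in range(3)]))
ray.shutdown()
```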
