
Support the multi-gpu option #1

Open
taliq opened this issue Jun 20, 2021 · 1 comment
taliq commented Jun 20, 2021

Hi, thanks for your effort and commitment to scalable GNNs; the memory issue really bugs me sometimes.
The autoscale method seems like a cool approach to handling this issue, and if it supported multiple GPUs, training would be faster than ever!
It seems hard to support the multi-GPU option, but I'm asking just in case :)

rusty1s (Owner) commented Jun 20, 2021

Thanks for your interest; I'm glad you like it. I will try to add multi-GPU support when I have some free time, but it shouldn't be too hard to add. The main thing to take care of is that replicated models hold a shared history rather than individual ones, and that the pushing and pulling of histories is synchronized across the replicas as well.
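To make the concern concrete, here is a minimal CPU-only sketch (the class and method names are illustrative, not the real pyg_autoscale API): all model replicas must reference one shared history store rather than per-replica copies, and pushes and pulls must be serialized so no replica reads a half-written embedding.

```python
# Hypothetical sketch of a shared history buffer for replicated models.
# Names (SharedHistory, push, pull) are illustrative assumptions, not the
# actual API of this repository.
import threading


class SharedHistory:
    """One history buffer shared by ALL model replicas."""

    def __init__(self, num_nodes, dim):
        # Embedding storage, one row per node.
        self.emb = [[0.0] * dim for _ in range(num_nodes)]
        # Serializes pushes and pulls across replicas so a pull never
        # observes a partially written row.
        self.lock = threading.Lock()

    def push(self, node_id, vec):
        with self.lock:
            self.emb[node_id] = list(vec)

    def pull(self, node_id):
        with self.lock:
            return list(self.emb[node_id])


# Every replica receives the SAME store instance (not individual copies):
history = SharedHistory(num_nodes=4, dim=2)
replicas = [history, history]  # e.g. one reference per GPU worker

replicas[0].push(1, [0.5, 0.5])
print(replicas[1].pull(1))  # the second replica observes the first's push
```

In a real multi-GPU setup the buffer would live in shared (pinned) host memory and the synchronization would span processes rather than threads, but the invariant is the same: one history, visible to all replicas, with coordinated access.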
