
Freezing weights and fine-tuning MPNN model #117

Open
Rhys-McAlister opened this issue Nov 14, 2023 · 7 comments

Comments

@Rhys-McAlister

Hi, I would like to utilise transfer learning similar to how it is outlined in this post: https://keras.io/guides/transfer_learning/ and was wondering if there were any examples on freezing the weights of the MPNN component of the model whilst allowing the FFNN component to be trained with a different dataset as in "Predicting Infrared Spectra with Message Passing Neural Networks" paper.

Thanks

@PatReis
Collaborator

PatReis commented Nov 16, 2023

Hello, from what I know, GNNs are somewhat tricky for transfer learning, but for related tasks it can work.
The Keras guide should carry over to kgcnn without any difference; you would just change the `output_mlp` then.
If you want, I can try to make an example notebook next week or so.
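The Keras-guide approach mentioned here comes down to marking the pretrained message-passing layers as non-trainable so that only the output MLP receives gradient updates. A minimal, dependency-free sketch of that idea (the `ToyLayer`/`train_step` names are illustrative only, not kgcnn or Keras API):

```python
# Toy illustration of layer freezing: a training step that skips any
# layer whose trainable flag is False, mirroring `layer.trainable = False`
# in the Keras transfer-learning guide.

class ToyLayer:
    def __init__(self, weight, trainable=True):
        self.weight = weight
        self.trainable = trainable

def train_step(layers, grads, lr=0.1):
    """Apply one gradient-descent step, updating only trainable layers."""
    for layer, g in zip(layers, grads):
        if layer.trainable:
            layer.weight -= lr * g

# Pretrained "message passing" part frozen, prediction "head" trainable.
mpnn = ToyLayer(weight=1.0, trainable=False)
head = ToyLayer(weight=2.0, trainable=True)
train_step([mpnn, head], grads=[0.5, 0.5])
print(mpnn.weight, head.weight)  # frozen weight unchanged; head updated
```

In real Keras/kgcnn models the same effect is achieved by setting `trainable = False` on the pretrained layers before compiling, so the optimizer never touches their weights.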

@PatReis
Collaborator

PatReis commented Nov 16, 2023

The current kgcnn 4.0 does not have the classic MPNN yet. Would any other GCN work as well?

@Rhys-McAlister
Author

> Hello, from what I know, GNNs are somewhat tricky for transfer learning, but for related tasks it can work. The Keras guide should carry over to kgcnn without any difference; you would just change the `output_mlp` then. If you want, I can try to make an example notebook next week or so.

Yes, an example notebook would be amazing, thank you.

> The current kgcnn 4.0 does not have the classic MPNN yet. Would any other GCN work as well?

The directed version of MPNN that is implemented in the literature section would be great. Once again, thank you so much.

@thegodone
Contributor

thegodone commented Nov 17, 2023 via email

@Rhys-McAlister
Author

DMPNN ("Analyzing Learned Molecular Representations for Property Prediction", pubs.acs.org) is implemented. It is in the README file and here: kgcnn/literature/DMPNN/_model.py. I use it.

Best,
Guillaume

Hi, thanks for the reply. I have been using the model that is implemented here; I just meant I was wondering whether it is possible to freeze the weights of that model, or whether there could be a tutorial notebook on how to do so.

Cheers,
Rhys

@PatReis
Collaborator

PatReis commented Nov 24, 2023

Hello, I added an example with freezing weights: https://github.com/aimat-lab/gcnn_keras/blob/master/notebooks/example_transfer_learning.ipynb
In principle you should also be able to make a new model by gluing two models together (which is not part of this notebook).
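The "gluing two models together" idea amounts to feeding a frozen, pretrained backbone's output into a fresh head that is trained on the new task. A dependency-free sketch under those assumptions (`frozen_backbone`, `Head`, and `glued_model` are hypothetical names, not kgcnn API):

```python
# Sketch of composing a frozen feature extractor with a new output head,
# analogous to wrapping a pretrained GNN's graph embedding in a new MLP.

def frozen_backbone(x):
    # Stands in for the pretrained message-passing stack; its behavior
    # is fixed and would never be updated during fine-tuning.
    return 2.0 * x + 1.0

class Head:
    """New task-specific head; only these parameters would be trained."""
    def __init__(self, w=1.0, b=0.0):
        self.w, self.b = w, b

    def __call__(self, h):
        return self.w * h + self.b

def glued_model(x, head):
    # The "glued" model: frozen backbone followed by the trainable head.
    return head(frozen_backbone(x))

head = Head(w=0.5, b=0.0)
print(glued_model(3.0, head))  # backbone maps 3.0 -> 7.0, head halves it
```

In Keras terms this corresponds to calling the pretrained model (with `trainable = False`) inside a new functional model whose final layers are freshly initialized.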

@Rhys-McAlister
Author

Thank you so much, the notebook looks excellent.
