Fix silent fail if network loads wrong weights #1521
Conversation
Signed-off-by: Matthias Hadlich <[email protected]>
Force-pushed from d82dea1 to c3c9d83 (for more information, see https://pre-commit.ci)
@tangy5 Can you briefly check whether this PR is fine so far? Then I would add
It looks good to me, thanks. @SachidanandAlle @diazandr3s, do we have further concerns if we set load_strict to true here by default? I'm thinking it should be OK, but I'm not sure about all of our pretrained model weights. Thanks
@@ -57,7 +57,7 @@ def __init__(
     output_label_key: str = "pred",
     output_json_key: str = "result",
     config: Union[None, Dict[str, Any]] = None,
-    load_strict: bool = False,
+    load_strict: bool = True,
Suggested change:
- load_strict: bool = True,
+ load_strict: bool = False,
Let's keep the base class value the same; let's not change it. We override it to true in all derived classes.
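The pattern proposed here (a permissive default in the base class, overridden by each derived class) can be sketched as follows. This is an illustrative sketch only; the class names are hypothetical and do not match MONAILabel's actual classes.

```python
# Hypothetical sketch of the suggested pattern: the base inference task
# keeps load_strict=False, while each derived task overrides it to True.
class BasicInferTask:
    def __init__(self, load_strict: bool = False):
        self.load_strict = load_strict


class MyDerivedInferTask(BasicInferTask):
    # Derived classes opt in to strict weight loading by default.
    def __init__(self, load_strict: bool = True):
        super().__init__(load_strict=load_strict)
```

This keeps the base class backward compatible while new task classes get strict loading unless they explicitly opt out.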
Then only the warning?
My reasoning was that keeping the default at False makes adding new models harder, which is why I proposed changing the default to True and setting the value to False in all the existing models.
In my case the network did not load any weights at all, but MONAILabel did not tell me. The state dict was completely different, yet I did not even get a warning. I was thus wondering for a few hours why a perfectly fine network would not work in MONAILabel...
In that case, can you verify all the sample apps?
What do you mean by that? If you mean that this would imply changing all the current sample apps, then yes, I agree. That's why I was also asking whether this change is desired at all. IMO it is worth the effort, since it makes creating new models easier and less prone to weird errors.
@SachidanandAlle Can you explain what you mean?
To keep this flag=True, we need to make sure the existing pre-trained models can load with this flag: the radiology/pathology and endoscopy apps/models. If some can't, then add flag=False for those models for which it fails. Otherwise it will break the release workflows.
@SachidanandAlle Oh, I'm sorry, I now get what you meant initially. What I had planned was to add that flag once you generally approved of the changes, which I was not sure about.
However, you have already merged the branch now.
Nevertheless, I would still add flag=False for all existing models, or do you want to do that based on feedback about what breaks?
I am not sure I can execute all the models, so I would simply add it to all current models.
I will add that today and update this PR, or open another one if I can't do that. Otherwise it might break some preexisting code.
I will merge this for now, but it would be better if this gets verified against the other existing example models/apps.
This is a draft fix for #1508. It sets load_strict to True in basic_infer.py and also adds a warning in case load_strict is False but the checkpoint does not contain all the necessary keys.
Most importantly, this PR helps with the creation of new models, where one might not even realize that the loaded network does not contain the necessary keys, which is what happened to me.
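The warning behavior described above can be sketched in plain Python. This is a minimal illustration of the idea, not MONAILabel's actual implementation; the function name and signature here are hypothetical (PyTorch's `load_state_dict(strict=False)` similarly returns the missing and unexpected keys).

```python
import logging


def load_weights(model_state, checkpoint_state, strict=False):
    # Sketch of the fix discussed in this PR: instead of silently loading
    # nothing when the state dicts do not match, either fail (strict=True)
    # or at least emit a warning listing the mismatched keys.
    missing = [k for k in model_state if k not in checkpoint_state]
    unexpected = [k for k in checkpoint_state if k not in model_state]
    if strict and (missing or unexpected):
        raise RuntimeError(
            f"State dict mismatch: missing={missing}, unexpected={unexpected}"
        )
    if missing or unexpected:
        logging.warning(
            "Loading weights non-strictly; missing keys: %s; unexpected keys: %s",
            missing,
            unexpected,
        )
    return missing, unexpected
```

With strict=True a completely different state dict raises immediately instead of producing an untrained network; with strict=False the mismatch is at least visible in the logs.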
For the default DeepEdit app, the warning looks like this:
It is a bit hidden, but nevertheless, if anyone were to debug the code, this should help.
Todo: