extended_attention_mask = extended_attention_mask.to(dtype=next(self.parameters()).dtype) # fp16 compatibility
The program fails on this statement. It looks like a dimension-mismatch problem to me. How should I fix it?
I'm stuck on this too. Did you manage to solve it?
Solved it.
Option 1: downgrade to pytorch==1.4.0.
Option 2: change
extended_attention_mask = extended_attention_mask.to(dtype=next(self.parameters()).dtype) # fp16 compatibility
to
extended_attention_mask = extended_attention_mask.to(dtype=torch.float32) # fp16 compatibility
It is the last line shown in the error traceback. If you are running on a server, be sure to locate and edit the file inside that server environment; I edited it locally and the change was never synced over.
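For context, here is roughly where that line sits in a BERT-style forward pass and what the fix above changes. This is only a minimal sketch assuming the usual pytorch-pretrained-bert style mask handling; the helper name is illustrative, and the note about nn.DataParallel on newer PyTorch is my guess at why Option 1 (downgrading) also works, not something stated in this thread.

```python
# Minimal sketch of extended-attention-mask handling in a BERT-style model.
# Names here are illustrative assumptions, not taken from this repo.
import torch

def build_extended_attention_mask(attention_mask, model):
    # Broadcast the [batch, seq_len] mask to [batch, 1, 1, seq_len] so it can be
    # added to the attention scores of every head.
    extended_attention_mask = attention_mask.unsqueeze(1).unsqueeze(2)

    # Original line: cast to the model's parameter dtype for fp16 compatibility.
    # Under nn.DataParallel on PyTorch >= 1.5 this can raise StopIteration,
    # because per-GPU replicas no longer expose parameters via .parameters().
    # extended_attention_mask = extended_attention_mask.to(dtype=next(model.parameters()).dtype)

    # Workaround from this thread: hard-code float32 (adjust if you actually train in fp16).
    extended_attention_mask = extended_attention_mask.to(dtype=torch.float32)

    # Masked positions get a large negative value before the softmax.
    extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0
    return extended_attention_mask
```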