
On the implementation of the formula ReLU(W ∑r + b) in the forward() function of model.py #11

Open
gbchang94 opened this issue Jun 13, 2021 · 0 comments
Hello, while reading the forward() function in model.py I noticed that the Model class implements the paper's formula ReLU(W ∑r + b) with the code in the following order:

act1 = self.activation(drop1) # ReLU
l = self.Linear(act1) # Linear
As I understand it, this order applies ReLU to ∑r first and then computes W · ReLU(∑r) + b, which does not match the formula described in the paper, yet it reproduces the accuracy reported in the paper's experiments. After I swapped the order of these two lines, the accuracy dropped to around 80%. I am quite puzzled by this. Could you point out whether there is a flaw in my reasoning? I really cannot figure out this step. Thanks!
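To make the two orderings being compared concrete, here is a minimal sketch (PyTorch assumed; the class name OrderDemo, the dimensions, and the relu_first flag are hypothetical illustrations, only the attribute names Linear/activation mirror the snippet above):

import torch
import torch.nn as nn

class OrderDemo(nn.Module):
    def __init__(self, in_dim=128, out_dim=64, relu_first=True):
        super().__init__()
        self.Linear = nn.Linear(in_dim, out_dim)   # provides W and b
        self.activation = nn.ReLU()
        self.relu_first = relu_first

    def forward(self, r_sum):
        if self.relu_first:
            # order used in model.py: W * ReLU(sum r) + b
            act1 = self.activation(r_sum)
            return self.Linear(act1)
        # order as written in the paper: ReLU(W * sum r + b)
        l = self.Linear(r_sum)
        return self.activation(l)

x = torch.randn(4, 128)  # hypothetical batch of summed relation vectors
print(OrderDemo(relu_first=True)(x).shape)   # model.py ordering
print(OrderDemo(relu_first=False)(x).shape)  # paper ordering

The two orderings are not equivalent in general: the first clamps the summed representation before the affine map, while the second clamps the output of the affine map, so swapping the lines changes the function being learned.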
