
The loss code has a bug in its handling of batch size. #23

Open
echoht opened this issue May 5, 2023 · 4 comments

Comments

@echoht

echoht commented May 5, 2023

[Screenshot of the loss code in question]
The logic here works when batch size is set to 1, but when batch size != 1, the tensor sizes fail to match.
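The repository's actual loss code is only shown in the screenshot, so the sketch below is a hypothetical NumPy reconstruction of how this kind of mismatch typically arises: code that squeezes out the batch dimension works only for shape `(1, L)` and breaks for `(B, L)` with `B > 1`. The function names `loss_batch1` and `loss_batched` and the squared-error loss are illustrative assumptions, not the repo's real implementation.

```python
import numpy as np

def loss_batch1(scores, labels):
    # Hypothetical version of the bug: squeezing axis 0 assumes the
    # batch dimension is exactly 1. For a (B, L) input with B > 1,
    # squeeze(0) raises an error (in PyTorch it would instead leave
    # the shape unchanged and cause a downstream size mismatch).
    s = scores.squeeze(0)   # valid only if scores has shape (1, L)
    y = labels.squeeze(0)
    return float(np.mean((s - y) ** 2))

def loss_batched(scores, labels):
    # Batch-aware version: reduce over the sequence (last) dimension
    # per example, then average across the batch dimension.
    per_example = np.mean((scores - labels) ** 2, axis=-1)  # shape (B,)
    return float(np.mean(per_example))
```

For batch size 1 the two functions agree, so a batched rewrite of this shape leaves existing single-example behavior unchanged.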

@GanjinZero
Owner

With batch size != 1 you generally run out of GPU memory anyway, so the batch size != 1 case was never implemented; if you're interested, feel free to modify it yourself.

@echoht
Author

echoht commented May 6, 2023

> With batch size != 1 you generally run out of GPU memory anyway, so the batch size != 1 case was never implemented; if you're interested, feel free to modify it yourself.

026adca — could you help code-review it?

@GanjinZero
Owner

I'll accept the PR next Monday, once I've confirmed the code is correct.

@echoht
Author

echoht commented May 8, 2023

> I'll accept the PR next Monday, once I've confirmed the code is correct.

It's commit 701aaeb — that's the latest one.
