I have observed that in the code, the model's `scene.gaussians.train()` and `gaussians.optimizer.step()` are both executed inside a `with torch.no_grad():` block. Doesn't this prevent the parameters from being updated? Can the parameters be updated by relying solely on `loss.backward()`?
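The behavior being asked about can be checked with a small standalone sketch (a hypothetical toy parameter, not the actual gaussian-splatting code). The key point: `loss.backward()` runs *outside* the `no_grad` block and fills in `.grad`; `optimizer.step()` only reads those `.grad` tensors and mutates the parameters in place, so wrapping it in `torch.no_grad()` does not block the update — it merely keeps the update itself out of the autograd graph.

```python
import torch

# Toy parameter and optimizer (illustrative, not from the repo).
param = torch.nn.Parameter(torch.tensor([1.0]))
optimizer = torch.optim.SGD([param], lr=0.1)

# Forward pass and backward pass happen with grad tracking ON.
loss = (param ** 2).sum()
loss.backward()            # populates param.grad = 2 * param = 2.0

# The update step reads .grad and modifies param in place;
# no_grad only stops autograd from recording this bookkeeping.
with torch.no_grad():
    optimizer.step()

print(param.item())        # 1.0 - 0.1 * 2.0 = 0.8
```

So the gradients needed for the update were already computed by `loss.backward()` before entering the `no_grad` context; `optimizer.step()` under `no_grad` is standard PyTorch practice.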