Hi, I am training an ACGAN on the MNIST dataset (the same question applies to any other conditional GAN). I followed the method in your README.md:
```python
import torch
from optimizers import ACGD

device = torch.device('cuda:0')
lr = 0.0001
G = Generator()
D = Discriminator()
# ACGD: CGD with adaptive learning rates
optimizer = ACGD(max_params=G, min_params=D, lr=lr, device=device)

for img in dataloader:
    d_real = D(img)
    z = torch.randn((batch_size, z_dim), device=device)
    d_fake = D(G(z))
    loss = criterion(d_real, d_fake)
    optimizer.zero_grad()
    optimizer.step(loss=loss)
```
But in conditional GANs, D usually returns two outputs, e.g. `[d_real, cond_real]` or `[d_fake, cond_fake]`, and a conditional (auxiliary classification) loss must also be computed, for example with CrossEntropy.
My experiment with this setup failed, so could you provide a demo of how to use ACGD with conditional GANs?
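For concreteness, here is a minimal sketch (not the repository's official answer) of how the two discriminator heads and the auxiliary CrossEntropy term might be folded into the single scalar loss that the README-style `step(loss=...)` API expects. All network shapes, names, and sizes below are made up for illustration; the ACGD calls are shown as comments since the optimizer package may not be installed:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical toy sizes, just to make the sketch runnable.
z_dim, n_classes, img_dim, batch_size = 16, 10, 784, 8

# Toy generator: noise concatenated with a one-hot class label -> image.
G = nn.Sequential(nn.Linear(z_dim + n_classes, img_dim))

class AuxDiscriminator(nn.Module):
    """ACGAN-style D with two heads: validity score and class logits."""
    def __init__(self):
        super().__init__()
        self.body = nn.Linear(img_dim, 32)
        self.adv_head = nn.Linear(32, 1)          # real/fake score
        self.cls_head = nn.Linear(32, n_classes)  # condition logits

    def forward(self, x):
        h = torch.relu(self.body(x))
        return self.adv_head(h), self.cls_head(h)

D = AuxDiscriminator()
adv_criterion = nn.BCEWithLogitsLoss()
aux_criterion = nn.CrossEntropyLoss()

# Stand-ins for one real data batch and its labels.
real_img = torch.randn(batch_size, img_dim)
real_label = torch.randint(0, n_classes, (batch_size,))

# Sample fake images with random condition labels.
z = torch.randn(batch_size, z_dim)
fake_label = torch.randint(0, n_classes, (batch_size,))
fake_img = G(torch.cat([z, F.one_hot(fake_label, n_classes).float()], dim=1))

d_real, cls_real = D(real_img)
d_fake, cls_fake = D(fake_img)

# Adversarial part: the zero-sum game that CGD-style optimizers solve
# (D minimizes this, G maximizes it).
adv_loss = (adv_criterion(d_real, torch.ones_like(d_real)) +
            adv_criterion(d_fake, torch.zeros_like(d_fake)))

# Auxiliary classification part on the condition labels.
aux_loss = (aux_criterion(cls_real, real_label) +
            aux_criterion(cls_fake, fake_label))

# Both terms folded into the single scalar the optimizer receives, e.g.:
#   optimizer = ACGD(max_params=G, min_params=D, lr=1e-4, device=device)
#   optimizer.zero_grad()
#   optimizer.step(loss=adv_loss + aux_loss)
loss = adv_loss + aux_loss
```

One caveat that may explain a failed experiment: ACGD optimizes a single zero-sum objective, so with `loss = adv_loss + aux_loss` the max player (G) is driven to *maximize* the auxiliary classification loss, whereas in ACGAN both players should minimize it. Whether the two terms need different signs (or separate handling) for the two players is exactly the kind of thing a demo from the authors would clarify.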