Convolutional Generator
Define a convolutional generator following the DCGAN guidelines discussed in the last video. torch.nn has been pre-imported as nn for your convenience. Additionally, a custom function dc_gen_block() is available, which returns a block of a transposed convolution, batch norm, and ReLU activation. This function serves as a foundational component for constructing the convolutional generator. You can get familiar with dc_gen_block()'s definition below.
def dc_gen_block(in_dim, out_dim, kernel_size, stride):
    return nn.Sequential(
        nn.ConvTranspose2d(in_dim, out_dim, kernel_size, stride=stride),
        nn.BatchNorm2d(out_dim),
        nn.ReLU()
    )
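To see what a single block does, here is a small sketch of dc_gen_block() in action; the channel counts (100 in, 512 out) and batch size are illustrative values, not part of the exercise:

```python
import torch
import torch.nn as nn

def dc_gen_block(in_dim, out_dim, kernel_size, stride):
    return nn.Sequential(
        nn.ConvTranspose2d(in_dim, out_dim, kernel_size, stride=stride),
        nn.BatchNorm2d(out_dim),
        nn.ReLU()
    )

# With kernel_size=4 and stride=2 (no padding), a 1x1 feature map
# grows to (1 - 1) * 2 + 4 = 4, i.e. a 4x4 map.
block = dc_gen_block(100, 512, kernel_size=4, stride=2)
out = block(torch.randn(8, 100, 1, 1))
print(out.shape)  # torch.Size([8, 512, 4, 4])
```

Each block therefore both changes the number of channels and upsamples the spatial size, which is why stacking a few of them turns a noise vector into an image-sized tensor.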
This exercise is part of the course Deep Learning for Images with PyTorch.
Exercise instructions
- Add the last generator block, mapping the size of the feature maps to 256.
- Add a transposed convolution with the output size of 3.
- Add the tanh activation.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
class DCGenerator(nn.Module):
    def __init__(self, in_dim, kernel_size=4, stride=2):
        super(DCGenerator, self).__init__()
        self.in_dim = in_dim
        self.gen = nn.Sequential(
            dc_gen_block(in_dim, 1024, kernel_size, stride),
            dc_gen_block(1024, 512, kernel_size, stride),
            # Add last generator block
            ____,
            # Add transposed convolution
            ____(____, ____, kernel_size, stride=stride),
            # Add tanh activation
            ____
        )

    def forward(self, x):
        x = x.view(len(x), self.in_dim, 1, 1)
        return self.gen(x)
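For reference, one possible completion that follows the three instructions above is sketched below; the latent size (16) and batch size (8) in the usage example are arbitrary choices for demonstration, and the final spatial size follows from kernel_size=4, stride=2 with no padding:

```python
import torch
import torch.nn as nn

def dc_gen_block(in_dim, out_dim, kernel_size, stride):
    return nn.Sequential(
        nn.ConvTranspose2d(in_dim, out_dim, kernel_size, stride=stride),
        nn.BatchNorm2d(out_dim),
        nn.ReLU()
    )

class DCGenerator(nn.Module):
    def __init__(self, in_dim, kernel_size=4, stride=2):
        super(DCGenerator, self).__init__()
        self.in_dim = in_dim
        self.gen = nn.Sequential(
            dc_gen_block(in_dim, 1024, kernel_size, stride),
            dc_gen_block(1024, 512, kernel_size, stride),
            # Last generator block: map the feature maps to 256 channels
            dc_gen_block(512, 256, kernel_size, stride),
            # Transposed convolution to 3 output channels (an RGB image)
            nn.ConvTranspose2d(256, 3, kernel_size, stride=stride),
            # Tanh squashes pixel values into [-1, 1]
            nn.Tanh()
        )

    def forward(self, x):
        # Reshape the flat noise vector into a (in_dim, 1, 1) feature map
        x = x.view(len(x), self.in_dim, 1, 1)
        return self.gen(x)

gen = DCGenerator(in_dim=16)
noise = torch.randn(8, 16)
images = gen(noise)
# Spatial size per layer: 1 -> 4 -> 10 -> 22 -> 46
print(images.shape)  # torch.Size([8, 3, 46, 46])
```

Note that the final layer is a bare transposed convolution followed by Tanh, with no batch norm or ReLU: the DCGAN guidelines keep normalization out of the generator's output layer so the Tanh can produce the full [-1, 1] pixel range.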