I'm wondering whether a Python list can take part in backward in PyTorch. Here is what I mean:
def forward(self, input):
    feature = []
    x1 = self.conv1(input)
    feature.append(x1)
    x1 = self.maxpool1(x1)
    x2 = self.conv2(x1)
    feature.append(x2)
    x2 = self.maxpool1(x2)
    x3 = self.conv3(x2)
    feature.append(x3)
    x3 = self.maxpool1(x3)
    x4 = self.conv4(x3)
    feature.append(x4)
    return feature
The conv and maxpool attributes are just basic PyTorch modules.
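To make this concrete, here is a self-contained sketch of the kind of module I have in mind; all of the channel and kernel sizes are placeholders I picked just for illustration:

import torch
import torch.nn as nn

class FeatureNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Placeholder layer sizes, only for illustration.
        self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
        self.conv3 = nn.Conv2d(32, 64, kernel_size=3, padding=1)
        self.conv4 = nn.Conv2d(64, 128, kernel_size=3, padding=1)
        self.maxpool1 = nn.MaxPool2d(2)  # stateless, so reusing it is fine

    def forward(self, input):
        # Same forward as above: collect intermediate tensors in a list.
        feature = []
        x1 = self.conv1(input)
        feature.append(x1)
        x1 = self.maxpool1(x1)
        x2 = self.conv2(x1)
        feature.append(x2)
        x2 = self.maxpool1(x2)
        x3 = self.conv3(x2)
        feature.append(x3)
        x3 = self.maxpool1(x3)
        x4 = self.conv4(x3)
        feature.append(x4)
        return feature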
My question is: if I write the forward method of my own module like this, can PyTorch still back-propagate normally? I mean, do all of the variables involved in autograd have to be tensors, or is it fine that feature is a plain Python list?
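Put differently, this is the check I would like to pass, using the FeatureNet sketch above (the input shape is again made up): build a scalar loss that touches every tensor in the returned list, call backward(), and see whether a gradient arrives at the earliest layer.

model = FeatureNet()
x = torch.randn(1, 3, 64, 64)           # arbitrary input shape for the test

features = model(x)                     # a plain Python list of tensors
loss = sum(f.mean() for f in features)  # scalar loss touching every entry
loss.backward()

# If autograd traced through the list, conv1 has received a gradient.
print(model.conv1.weight.grad is not None)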