I get stuck on this from time to time, so I'm leaving it here as a memo.
from typing import List

from tensorflow.keras import Model
from tensorflow.keras.layers import (AveragePooling2D, Conv2D, Dense,
                                     Layer, MaxPool2D, Softmax)

class ResNet50(Model):
    def __init__(self, stride: int = 1, *args, **kwargs):
        super(ResNet50, self).__init__(*args, **kwargs)
        self.stride = stride
        self.conv = Conv2D(filters=16, kernel_size=7, strides=self.stride, padding='same')
        self.avgpool = AveragePooling2D()
        self.maxpool = MaxPool2D(padding='same')
        self.ResBlocks: List[Layer] = []
        self.softmax = Softmax()
        self.dense = Dense(1, activation='sigmoid')

    def call(self, inputs):
        conv_1 = self.conv(inputs)
        maxpooled = self.maxpool(conv_1)
        layers_num = [3, 4, 6, 3]
        for i in range(len(layers_num)):
            for j in range(layers_num[i]):
                # Bug: Residual_Block layers are created here, inside call
                if i == 0 and j == 0:
                    self.ResBlocks.append(Residual_Block(filters_num=4 * 2 ** i)(maxpooled))
                else:
                    self.ResBlocks.append(Residual_Block(filters_num=4 * 2 ** i)(self.ResBlocks[-1]))
        avgpooled = self.avgpool(self.ResBlocks[-1])
        value = self.dense(avgpooled)
        return avgpooled, value
I wrote something like the above and got the error `tf.function-decorated function tried to create variables on non-first call`. If you look it up, the cause is that layers created inside call declare new TensorFlow variables every time the function is traced, and tf.function only allows variable creation on the first call.
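As a minimal sketch of the same failure mode (the toy model, input shape, and forward function here are my own, not from the code above), a subclassed model that builds a layer inside call trips this error as soon as it runs under tf.function:

import tensorflow as tf

class BadModel(tf.keras.Model):
    def call(self, inputs):
        # Bug: a fresh Dense layer (and its variables) is created on every trace
        dense = tf.keras.layers.Dense(1)
        return dense(inputs)

@tf.function
def forward(model, x):
    return model(x)

model = BadModel()
x = tf.random.normal([2, 4])
# Raises: ValueError: tf.function-decorated function tried to create
# variables on non-first call.
forward(model, x)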
class ResNet50(Model):
    def __init__(self, stride: int = 1, *args, **kwargs):
        super(ResNet50, self).__init__(*args, **kwargs)
        self.stride = stride
        self.avgpool = AveragePooling2D()
        self.maxpool = MaxPool2D(padding='same')
        # All Residual_Block layers are now created once, in __init__
        self.ResBlocks: List[Layer] = []
        layers_num = [3, 4, 6, 3]
        for i in range(len(layers_num)):
            for _ in range(layers_num[i]):
                self.ResBlocks.append(Residual_Block(filters_num=4 * 2 ** i))
        self.conv = Conv2D(filters=16, kernel_size=7, strides=self.stride, padding='same')
        self.softmax = Softmax()
        self.dense = Dense(1, activation='sigmoid')

    def call(self, inputs):
        # call now only wires existing layers together; no variables are created here
        conv_1 = self.conv(inputs)
        maxpooled = self.maxpool(conv_1)
        for layer in self.ResBlocks:
            maxpooled = layer(maxpooled)
        avgpooled = self.avgpool(maxpooled)
        value = self.dense(avgpooled)
        return avgpooled, value
That fixed it. The cause: when subclassing tensorflow.keras's Model, every layer must be declared in __init__, and call should only connect layers that already exist. I had completely forgotten that. I'm still not sure how far this variable-creation rule reaches elsewhere; I'd like to look into it when I have time.
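For reference, a quick smoke test of the fixed model (the input shape here is a guess of mine, and it assumes a Residual_Block implementation like the one used above, which the post doesn't show):

# Hypothetical smoke test -- requires a Residual_Block layer to be defined
model = ResNet50(stride=2)
x = tf.random.normal([1, 64, 64, 3])  # assumed input shape
features, value = model(x)            # first call builds all variables
features, value = model(x)            # second call reuses them: no error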