This is a memo I wrote after getting stuck on this. I haven't actually run everything yet, so if you find any mistakes, please let me know.
In PyTorch, nn.ReflectionPad2d behaves as follows: it pads a tensor by mirroring it across its edges, without repeating the border elements themselves. See the official documentation for details.
>>> m = nn.ReflectionPad2d(2)
>>> input = torch.arange(9, dtype=torch.float).reshape(1, 1, 3, 3)
>>> input
tensor([[[[0., 1., 2.],
          [3., 4., 5.],
          [6., 7., 8.]]]])
>>> m(input)
tensor([[[[8., 7., 6., 7., 8., 7., 6.],
          [5., 4., 3., 4., 5., 4., 3.],
          [2., 1., 0., 1., 2., 1., 0.],
          [5., 4., 3., 4., 5., 4., 3.],
          [8., 7., 6., 7., 8., 7., 6.],
          [5., 4., 3., 4., 5., 4., 3.],
          [2., 1., 0., 1., 2., 1., 0.]]]])
>>> # using different padding on each side: the tuple is (left, right, top, bottom)
>>> m = nn.ReflectionPad2d((1, 1, 2, 0))
>>> m(input)
tensor([[[[7., 6., 7., 8., 7.],
          [4., 3., 4., 5., 4.],
          [1., 0., 1., 2., 1.],
          [4., 3., 4., 5., 4.],
          [7., 6., 7., 8., 7.]]]])
PyTorch official documentation: https://pytorch.org/docs/stable/nn.html
To achieve the same thing in TensorFlow, you use tf.pad. (It is covered in the official documentation, but I couldn't find it on my own at first, so I wrote this article.)
tf.pad(
    tensor,
    paddings,
    mode='REFLECT',  # the default is 'CONSTANT'; pass 'REFLECT' for reflection padding
    constant_values=0,
    name=None
)
An example is as follows.
t = tf.constant([[1, 2, 3], [4, 5, 6]])
paddings = tf.constant([[1, 1], [2, 2]])
tf.pad(t, paddings, "REFLECT")
# [[6, 5, 4, 5, 6, 5, 4],
#  [3, 2, 1, 2, 3, 2, 1],
#  [6, 5, 4, 5, 6, 5, 4],
#  [3, 2, 1, 2, 3, 2, 1]]
As the two examples above show, tf.pad with mode='REFLECT' produces the same reflection padding as PyTorch's nn.ReflectionPad2d.
TensorFlow official documentation: https://www.tensorflow.org/api_docs/python/tf/pad
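To double-check the correspondence, here is a minimal sketch (an untested comparison, assuming both torch and tensorflow are installed). Note that tf.pad takes one [before, after] pair per dimension, while nn.ReflectionPad2d takes a single width or a (left, right, top, bottom) tuple:

import numpy as np
import torch
import torch.nn as nn
import tensorflow as tf

a = np.arange(9, dtype=np.float32).reshape(1, 1, 3, 3)  # NCHW layout

# PyTorch: pad height and width by 2 on every side
pt_out = nn.ReflectionPad2d(2)(torch.from_numpy(a)).numpy()

# TensorFlow: one [before, after] pair per dimension (N, C, H, W)
tf_out = tf.pad(tf.constant(a), [[0, 0], [0, 0], [2, 2], [2, 2]], mode='REFLECT').numpy()

print(np.array_equal(pt_out, tf_out))  # expected: True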