A memo from when I looked into the subject in the title.
When creating an instance of the nn.Softmax class, you can specify the axis to apply softmax along with the dim argument.
This time, let's use the following tensor as an example.
import torch
import torch.nn as nn

input = torch.randn(2, 3)
print(input)
tensor([[-0.2562, -1.2630, -0.1973],
        [ 0.8285, -0.9981,  0.3171]])
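For reference, this tensor has shape (2, 3): dim=0 runs down the 2 rows, and dim=1 runs across the 3 columns.

print(input.shape)
torch.Size([2, 3])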
m = nn.Softmax()
print(m(input))
You get scolded like this:
/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:2: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
m = nn.Softmax(dim=0)
print(m(input))
With dim=0, softmax is computed along the first axis, i.e. column by column:
tensor([[0.2526, 0.4342, 0.3742],
        [0.7474, 0.5658, 0.6258]])
Just to check, summing along each column gives a total of 1.
torch.sum(m(input), dim=0)
tensor([1., 1., 1.])
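To see what dim=0 is doing, here is a minimal hand computation of the same softmax using exp and a column-wise sum (note this naive form can overflow for large inputs; actual implementations subtract the max first for numerical stability):

# softmax along dim=0: exp(x) divided by the column totals
exp = torch.exp(input)
print(exp / exp.sum(dim=0))

This prints the same values as m(input) above.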
m = nn.Softmax(dim=1)
print(m(input))
With dim=1, softmax is computed along the second axis, i.e. row by row:
tensor([[0.4122, 0.1506, 0.4372],
        [0.5680, 0.0914, 0.3406]])
Again, summing along each row gives a total of 1.
torch.sum(m(input), dim=1)
tensor([1.0000, 1.0000])
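By the way, you can get the same result without instantiating a module by using the functional API (assuming the conventional import alias F):

import torch.nn.functional as F

# equivalent to nn.Softmax(dim=1)(input)
print(F.softmax(input, dim=1))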