Broadcast operation by raising the dimension with np.newaxis

This article is the second half of my notes on understanding what a 3-D np.array is and what np.newaxis does. The first half, which covers the mental image of np.array and the basics of np.newaxis, is here. In this article I explain how to use np.newaxis in practice: given the two 2-D arrays of stacked 2-D vectors introduced in the first half, we compute the squared distance between every pair of vectors, one from each array, in a single all-pairs (brute-force) operation.

STEP 3. Broadcast

Before getting into the main subject, let's review broadcasting. Broadcasting is the mechanism NumPy uses to perform operations between arrays of different shapes.

>>> x = np.array([1, 2])  # vector with 2 elements, shape (2,)
>>> A = np.array([[1, 2],
                  [3, 4],
                  [5, 6]])  # 3x2 matrix (3 rows, 2 columns)
>>> B = np.array([[1],
                  [2]])  # 2x1 matrix (2 rows, 1 column)
>>> C = np.array([[[1],
                   [2]],

                  [[3],
                   [4]]])  # 3-D array of shape (2, 2, 1)

>>> x + A  # broadcasts to a 3x2 matrix
array([[2, 4],
       [4, 6],
       [6, 8]]) 
>>> x + B  # broadcasts to a 2x2 matrix
array([[2, 3],
       [3, 4]])
>>> x + C  # broadcasts to a 3-D array of shape (2, 2, 2)
array([[[2, 3],
        [3, 4]],

       [[4, 5],
        [5, 6]]])


STEP 4. Raise the dimension and broadcast operation

Returning to the main subject: what does np.newaxis actually buy us?

For one-dimensional arrays

Starting with the one-dimensional case, np.newaxis lets you "store the result of an all-pairs operation between the elements of two arrays along a new axis".

For example, if you have $x = [1, 3, 5, 7]$ and $y = [2, 4, 6]$ and want all $4 \times 3 = 12$ pairwise sums, use np.newaxis to turn $y$ into a 3x1 matrix (3 rows, 1 column) and then add $x$; broadcasting gives you a 3x4 matrix.

>>> x = np.array([1, 3, 5, 7])
>>> y = np.array([2, 4, 6])
>>> x[np.newaxis, :] + y[:, np.newaxis]  # all-pairs addition of the elements of x and y
array([[  3,  5,  7,  9],
       [  5,  7,  9, 11],
       [  7,  9, 11, 13]])
# x + y[:, np.newaxis] gives the same result
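As a side note, np.newaxis is just an alias for None, and this all-pairs addition is the same thing as an outer sum. A small sketch (variable names as above):

```python
import numpy as np

x = np.array([1, 3, 5, 7])
y = np.array([2, 4, 6])

grid = x[np.newaxis, :] + y[:, np.newaxis]  # shape (3, 4)

# np.newaxis is literally None, so this spelling is equivalent
assert np.array_equal(grid, x[None, :] + y[:, None])

# np.add.outer computes the same all-pairs sum directly
assert np.array_equal(grid, np.add.outer(y, x))
print(grid)
```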


For two-dimensional arrays

What about two dimensions? Suppose you have two arrays, each consisting of row vectors of the same dimension (2-D here) stacked vertically, as shown below.

>>> A = np.array([[1, 2],
                  [3, 4],
                  [5, 6]])
>>> B = np.array([[1, 1],
                  [2, 2]])

With a well-placed np.newaxis, you can perform the all-pairs calculation for each component of the vectors at once. Think of it as running the one-dimensional all-pairs operation above several times simultaneously, once for the $x$ components and once for the $y$ components.

>>> A[np.newaxis, :, :] - B[:, np.newaxis, :]  # all-pairs subtraction of x and y components
array([[[ 0,  1],
        [ 2,  3],
        [ 4,  5]],

       [[-1,  0],
        [ 1,  2],
        [ 3,  4]]])
# A - B[:, np.newaxis, :] gives the same result


As long as you arrange the arrays so that they broadcast correctly, you get the same values even if you place the new axes differently.

>>> A.T[np.newaxis, :, :] - B[:, :, np.newaxis]  # all-pairs subtraction of x and y components
array([[[ 0,  2,  4],
        [ 1,  3,  5]],

       [[-1,  1,  3],
        [ 0,  2,  4]]])
# A.T - B[:, :, np.newaxis] gives the same result
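The two placements are not unrelated: the second result is just the first with its last two axes swapped. A quick check (array names as above):

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])
B = np.array([[1, 1], [2, 2]])

D1 = A[np.newaxis, :, :] - B[:, np.newaxis, :]    # shape (2, 3, 2)
D2 = A.T[np.newaxis, :, :] - B[:, :, np.newaxis]  # shape (2, 2, 3)

# swapping the last two axes of D1 reproduces D2
assert np.array_equal(D2, D1.transpose(0, 2, 1))
```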


Main subject

Given the sets of 2-D vectors $A$ and $B$ above, let's compute the squared distance between every pair of vectors. The squared distance between $(a_x, a_y)$ and $(b_x, b_y)$ is $(a_x - b_x)^2 + (a_y - b_y)^2$, so we first take, for all pairs, the squares of the differences of the $x$ components and of the $y$ components; that is just the square of the calculation we did above. Then we add the squared $x$ difference and the squared $y$ difference, i.e. we sum along the component axis, the one to which no new axis was added for $A$ or $B$. (This is easier to understand if you picture it.)

>>> ((A.T[np.newaxis, :, :] - B[:, :, np.newaxis])**2).sum(axis=1)
array([[ 1, 13, 41],
       [ 1,  5, 25]])
# ((A.T - B[:, :, np.newaxis])**2).sum(axis=1) also works
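To convince yourself the one-liner is right, you can compare it against a plain double loop over the vectors (a sketch for verification only; the names match the session above):

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])
B = np.array([[1, 1], [2, 2]])

sq = ((A.T[np.newaxis, :, :] - B[:, :, np.newaxis]) ** 2).sum(axis=1)

# explicit loop: sq[i, j] is the squared distance between B[i] and A[j]
expected = np.empty((len(B), len(A)), dtype=int)
for i, b in enumerate(B):
    for j, a in enumerate(A):
        expected[i, j] = ((a - b) ** 2).sum()

assert np.array_equal(sq, expected)
print(sq)
```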


This is equivalent to the formula from "Essence of Machine Learning" quoted at the very beginning of the previous article. Also, as we saw above, the placement of the new axis added to $A$ or $B$ is not unique; the following give the same result:

>>> ((A[np.newaxis, :, :] - B[:, np.newaxis, :])**2).sum(axis=2)
array([[ 1, 13, 41],
       [ 1,  5, 25]])
# ((A - B[:, np.newaxis, :])**2).sum(axis=2) also works

>>> ((A.T[:, np.newaxis, :] - B.T[:, :, np.newaxis])**2).sum(axis=0)
array([[ 1, 13, 41],
       [ 1,  5, 25]])

To find the distance between vectors, take the square root of this.
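Putting it together, here is a minimal sketch of the distance matrix itself; np.linalg.norm along the component axis is an equivalent shortcut:

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])
B = np.array([[1, 1], [2, 2]])

diff = A[np.newaxis, :, :] - B[:, np.newaxis, :]  # shape (2, 3, 2)
dist = np.sqrt((diff ** 2).sum(axis=2))  # dist[i, j] = distance between B[i] and A[j]

# the same matrix via np.linalg.norm along the component axis
assert np.allclose(dist, np.linalg.norm(diff, axis=2))
print(dist)
```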
