[Statistics review] Four arithmetic operations of random variables

Introduction

Now that I've started reviewing **statistics**, I'll summarize my own understanding as a series of notes. This time I cover the most basic topic in statistics, the **four arithmetic operations on random variables**; in future posts I plan to cover **estimation, testing, multivariate analysis, and Bayesian statistics**. I hope this is helpful for anyone who wants to review statistics or check their understanding of it.

What is Statistical Inference?

What is inferential statistics in the first place? A famous anecdote about it is the story of **Poincaré and the baker**.

The anecdote goes as follows. Poincaré regularly bought bread from a local bakery, whose loaves were sold as weighing 1000 g. Suspecting that he was being shortchanged, he decided to weigh every loaf he bought. After a year, he confirmed that the weights of the loaves he had bought followed a normal distribution with a mean of 950 g, and so uncovered the fraud.

Now, how many loaves did Poincaré actually need to buy to detect the bakery's cheating? Using statistical interval estimation, it turns out that **buying about 70 loaves gives 99.98% certainty that the bakery is cheating** (quite cautious). Strictly speaking, the sample size must be decided in advance so that the sampling cannot be accused of being intentional.

(Figure: distribution of bread weights)
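As a rough sanity check of this claim, here is a minimal sketch in Python. The anecdote does not give the standard deviation of the loaf weights, so the 120 g below is purely a hypothetical value chosen for illustration; with it, a one-sided z-test over 70 loaves lands near the quoted certainty.

```python
# Minimal sketch of the claim above. sigma is NOT given in the anecdote;
# 120 g is a hypothetical value chosen purely for illustration.
from math import erf, sqrt

n, claimed_mean, actual_mean, sigma = 70, 1000, 950, 120
z = (claimed_mean - actual_mean) / (sigma / sqrt(n))  # standardized gap between means
certainty = 0.5 * (1 + erf(z / sqrt(2)))              # Phi(z): standard normal CDF
print(f"z = {z:.2f}, certainty = {certainty:.2%}")    # z = 3.49, certainty = 99.98%
```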

Four arithmetic operations in statistics

Before moving on to interval estimation, we need to understand the **formulas for the four arithmetic operations on random variables**. I derive each formula below, but if the derivations feel tedious, you can get a reasonable understanding from the results and comments alone.

For convenience, define the following symbols and formulas. Also, for simplicity, we will assume that the weight of the bread is a discrete value taking only integer values; in the case of a continuous value (a real number), simply replace $\sum$ with $\int$.

**Definition of symbols and formulas**
・$X = (x_1, x_2, x_3, \dots, x_n)$: **Random variable** (the values the weight of a loaf can take, e.g. 900 g, 901 g, 902 g, ..., 1000 g, 1001 g, 1002 g, etc.)

・$f(x)$: **Probability function** (the probability that the weight of a loaf is $x$, e.g. $f(900) = 0.005$, $f(1000) = 0.1$, etc.)

・$E(X) = \sum_{k} x_k f(x_k) = μ_X$: **Expected value of the random variable $X$** ($\sum$ of bread weight × probability of a loaf of that weight)

・$V(X) = \sum_{k} (x_k - μ_X)^2 f(x_k)$: **Variance of the random variable $X$** ($\sum$ of (bread weight − mean weight of all loaves)$^2$ × probability of a loaf of that weight)
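To make the definitions concrete, here is a small Python sketch that computes $E(X)$ and $V(X)$ for a toy bread-weight distribution (the weights and probabilities below are made up for illustration):

```python
# Toy bread-weight pmf; the values are made up for illustration.
weights = [900, 950, 1000, 1050, 1100]   # possible values x_k in grams
probs   = [0.1, 0.2, 0.4, 0.2, 0.1]      # f(x_k); must sum to 1

mu  = sum(x * p for x, p in zip(weights, probs))              # E(X) = sum of x_k f(x_k)
var = sum((x - mu) ** 2 * p for x, p in zip(weights, probs))  # V(X) = sum of (x_k - mu)^2 f(x_k)
print(mu, var)  # 1000.0 3000.0
```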

In deriving the four arithmetic rules for the expected value and variance of random variables, we assume below that the **random variables $X$ and $Y$ are independent**. That $X$ and $Y$ are independent means that $f(x_i, y_j) = f(x_i) × f(y_j)$ holds, i.e. there is no relationship between the probability of $x_i$ and the probability of $y_j$.
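Computationally, independence means the joint probability function is just the product of the marginals. A minimal sketch, with made-up pmfs:

```python
fx = {900: 0.4, 1000: 0.6}   # marginal pmf of X (hypothetical values)
fy = {1: 0.5, 2: 0.5}        # marginal pmf of Y (hypothetical values)

# Under independence, f(x_i, y_j) = f(x_i) * f(y_j).
joint = {(x, y): fx[x] * fy[y] for x in fx for y in fy}
assert abs(sum(joint.values()) - 1.0) < 1e-12  # still a valid distribution
```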

**➀ Expected value of "random variable $X$ + random variable $Y$"**
Find the expected value of the random variable $X + Y$, the sum of the random variables $X$ and $Y$.

\begin{align}
E(X+Y)&=\sum_{j,k}(x_j+y_k)f(x_j,y_k)\\
&=\sum_{j,k}x_jf(x_j,y_k)+\sum_{j,k}y_kf(x_j,y_k)\\
&=\sum_{j}x_jf(x_j)\sum_{k}f(y_k)+\sum_{k}y_kf(y_k)\sum_{j}f(x_j)\\
&=\sum_{j}x_jf(x_j)+\sum_{k}y_kf(y_k)\\
&=E(X)+E(Y)
\end{align}

The sums appearing on the third line give the **marginal probabilities** of $X$ and $Y$ (the probability of each variable with the effect of the other summed out), and since $\sum_{j} f(x_j) = \sum_{k} f(y_k) = 1$ they drop out on the fourth line. In the end, the expected value of the random variable $X + Y$ is just the sum of the expected values of $X$ and $Y$. For subtraction, only the sign changes.
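A quick numerical check of ➀, brute-forcing $E(X+Y)$ over the joint distribution of two made-up independent pmfs:

```python
fx = {900: 0.4, 1000: 0.6}   # hypothetical pmf of X
fy = {-10: 0.5, 10: 0.5}     # hypothetical pmf of Y

E = lambda f: sum(v * p for v, p in f.items())

lhs = sum((x + y) * fx[x] * fy[y] for x in fx for y in fy)  # E(X+Y) by full search
print(lhs, E(fx) + E(fy))  # both 960.0
```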

**➁ Expected value of "random variable $X$ × random variable $Y$"**
Find the expected value of the random variable $X × Y$, the product of the random variables $X$ and $Y$. As for the meaning of $\sum_{j,k}$, it helps to picture a full search that adds up the values over all possible $j$ and $k$.

\begin{align}
E(X×Y)&=\sum_{j,k}x_jy_kf(x_j,y_k)\\
&=\sum_{j,k}x_jy_kf(x_j)f(y_k)\\
&=\sum_{j}x_jf(x_j)\sum_{k}y_kf(y_k)\\
&=E(X)E(Y)
\end{align}

The expected value of the random variable $X × Y$ thus becomes just the product of the expected values of the random variables $X$ and $Y$ (the second line uses independence, $f(x_j, y_k) = f(x_j)f(y_k)$). For division, use the random variable $1/Y$ in place of $Y$; note that $E(1/Y)$ is generally not $1/E(Y)$.
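The same kind of full-search check for ➁, again with made-up independent pmfs:

```python
fx = {900: 0.4, 1000: 0.6}   # hypothetical pmf of X
fy = {1: 0.5, 2: 0.5}        # hypothetical pmf of Y

E = lambda f: sum(v * p for v, p in f.items())

lhs = sum(x * y * fx[x] * fy[y] for x in fx for y in fy)  # E(X*Y) by full search
print(lhs, E(fx) * E(fy))  # both 1440.0
```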

**➂ Expected value of "random variable $X$ × constant $a$"**
Multiply the random variable $X$ by the constant $a$ and find the expected value of the random variable $a × X$.

\begin{align}
E(a×X)&=\sum_{j}ax_jf(x_j)\\
&=ax_1f(x_1)+ax_2f(x_2)+\cdots\\
&=aE(X)
\end{align}

The expected value is simply multiplied by the constant. For division, just replace the constant $a$ with $1/a$.
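And a quick check of ➂:

```python
fx = {900: 0.4, 1000: 0.6}   # hypothetical pmf of X
a = 3

E = lambda f: sum(v * p for v, p in f.items())
print(sum(a * x * fx[x] for x in fx), a * E(fx))  # both 2880.0
```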

**➃ Variance of "random variable $X$ + random variable $Y$"**
Find the variance of the random variable $X + Y$, the sum of the random variables $X$ and $Y$. For convenience, the expected value of the random variable $X$ is written $E(X) = μ_X$.

\begin{align}
V(X+Y)=&\sum_{j,k}\bigl(x_j+y_k-(μ_X+μ_Y)\bigr)^2f(x_j,y_k)\\
=&\sum_{j,k}\bigl(x_j^2+y_k^2+μ_X^2+μ_Y^2+2x_jy_k-2x_jμ_X-2x_jμ_Y-2y_kμ_X-2y_kμ_Y+2μ_Xμ_Y\bigr)f(x_j,y_k)\\
=&\sum_{j,k}\bigl(x_j-μ_X\bigr)^2f(x_j,y_k)+\sum_{j,k}\bigl(y_k-μ_Y\bigr)^2f(x_j,y_k)+2\sum_{j,k}\bigl(x_j-μ_X\bigr)\bigl(y_k-μ_Y\bigr)f(x_j,y_k)\\
=&\sum_{j}\bigl(x_j-μ_X\bigr)^2f(x_j)+\sum_{k}\bigl(y_k-μ_Y\bigr)^2f(y_k)+2\sum_{j}\bigl(x_j-μ_X\bigr)f(x_j)\sum_{k}\bigl(y_k-μ_Y\bigr)f(y_k)\\
=&V(X)+V(Y)
\end{align}

The variance of the random variable $X + Y$ is just the sum of the variances of the random variables $X$ and $Y$. Incidentally, the third term on the third line is a statistic called the **covariance**, which expresses the correlation between $X$ and $Y$; if $X$ and $Y$ are not independent it does not vanish, and it must be taken into account when computing the variance.
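A numerical check of ➃, including the covariance term, which is 0 for independent $X$ and $Y$ (pmfs made up):

```python
fx = {900: 0.4, 1000: 0.6}   # hypothetical pmf of X
fy = {-10: 0.5, 10: 0.5}     # hypothetical pmf of Y

E = lambda f: sum(v * p for v, p in f.items())
V = lambda f: sum((v - E(f)) ** 2 * p for v, p in f.items())

mx, my = E(fx), E(fy)
vxy = sum((x + y - (mx + my)) ** 2 * fx[x] * fy[y] for x in fx for y in fy)
cov = sum((x - mx) * (y - my) * fx[x] * fy[y] for x in fx for y in fy)
print(vxy, V(fx) + V(fy), cov)  # 2500.0 2500.0 0.0
```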

**➄ Variance of "random variable $X$ × random variable $Y$"**
Multiply the random variable $X$ by the random variable $Y$ and find the variance of the random variable $X × Y$.

\begin{align}
V(X×Y)=&\sum_{j,k}\bigl(x_jy_k-μ_Xμ_Y\bigr)^2f(x_j,y_k)\\
=&\sum_{j,k}\bigl((x_jy_k)^2-2x_jy_kμ_Xμ_Y+(μ_Xμ_Y)^2\bigr)f(x_j,y_k)\\
=&\sum_{j,k}(x_jy_k)^2f(x_j,y_k)-μ_Xμ_Y\sum_{j,k}2x_jy_kf(x_j,y_k)+(μ_Xμ_Y)^2\\
=&\sum_{j,k}(x_jy_k)^2f(x_j,y_k)-2(μ_Xμ_Y)^2+(μ_Xμ_Y)^2\\
=&\sum_{j}x_j^2f(x_j)\sum_{k}y_k^2f(y_k)-(μ_Xμ_Y)^2 \\
=&E(X^2)E(Y^2)-(μ_Xμ_Y)^2\\
=&\bigl(V(X)+μ_X^2\bigr)\bigl(V(Y)+μ_Y^2\bigr)-(μ_Xμ_Y)^2 ※V(X)=\sum_{k}(x_k-μ_X)^2f(x_k)=E(X^2)-μ_X^2\\
=&V(X)V(Y)+μ_Y^2V(X)+μ_X^2V(Y)
\end{align}

The variance of the random variable $X × Y$ is expressed as $V(X)V(Y) + μ_Y^2 V(X) + μ_X^2 V(Y)$ using the expected values and variances of the random variables $X$ and $Y$. Note that, unlike the expected value, it does not simply become $V(X × Y) = V(X)V(Y)$.
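A full-search check of ➄ against the formula above (pmfs made up):

```python
fx = {900: 0.4, 1000: 0.6}   # hypothetical pmf of X
fy = {1: 0.5, 2: 0.5}        # hypothetical pmf of Y

E = lambda f: sum(v * p for v, p in f.items())
V = lambda f: sum((v - E(f)) ** 2 * p for v, p in f.items())

mx, my = E(fx), E(fy)
vxy = sum((x * y - mx * my) ** 2 * fx[x] * fy[y] for x in fx for y in fy)
formula = V(fx) * V(fy) + my ** 2 * V(fx) + mx ** 2 * V(fy)
print(vxy, formula)  # both 236400.0
```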

**➅ Variance of "random variable $X$ × constant $a$"**
Find the variance of the random variable $a × X$, the random variable $X$ multiplied by the constant $a$.

\begin{align}
V(a×X)&=\sum_{j}(ax_j-μ_{aX})^2f(x_j)\\
&=\sum_{j}(ax_j-aμ_X)^2f(x_j)\\
&=a^2\sum_{j}(x_j-μ_X)^2f(x_j)\\
&=a^2V(X)
\end{align}

Here $μ_{aX} = E(a×X) = aμ_X$ by ➂.

Note that multiplying a random variable by $a$ multiplies its variance by $a^2$.
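Finally, a check of ➅:

```python
fx = {900: 0.4, 1000: 0.6}   # hypothetical pmf of X
a = 3

E = lambda f: sum(v * p for v, p in f.items())
V = lambda f: sum((v - E(f)) ** 2 * p for v, p in f.items())

vax = sum((a * x - a * E(fx)) ** 2 * fx[x] for x in fx)
print(vax, a ** 2 * V(fx))  # both 21600.0
```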

Next time

Next time, using the formulas for the four arithmetic operations of random variables derived here, I will actually work out an interval estimate for the bread weight distribution shown in the figure at the beginning.
