OI, Ch. 3
Smith College
Mar 4, 2026
\[ S = \{⚀, ⚁, ⚂, ⚃, ⚄, ⚅\} \]
\[ X = \{ 1, 2, 3, 4, 5, 6 \} \]
\[ x_1 = 1, x_2 = 2, \ldots, x_6 = 6 \]
\[ \text{for all } i, \Pr(X = x_i) = \frac{1}{6} \]
mean (aka expected value) of \(X\):
For discrete r.v.’s: \[ \mathbb{E}[X] = \sum_{i=1}^k \Pr(X = x_i) \cdot x_i \]
(sometimes called the first moment)
Also, \(\mu_X = \mathbb{E}[X]\)
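The definition above can be checked directly for the die example: a short Python sketch summing \(\Pr(X = x_i) \cdot x_i\) over the six equally likely faces.

```python
# Expected value of a discrete r.v.: E[X] = sum over i of Pr(X = x_i) * x_i.
# For a fair six-sided die, each face has probability 1/6.
outcomes = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mean = sum(p * x for p, x in zip(probs, outcomes))
print(mean)  # 3.5
```

Note that \(\mathbb{E}[X] = 3.5\) is not a value the die can actually show; the mean is a long-run average, not a typical outcome.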
variance of \(X\):
For discrete r.v.’s: \[ Var[X] = \sum_{i=1}^k \Pr(X = x_i) \cdot (x_i - \mathbb{E}[X])^2 \]
(sometimes called the second central moment)
Also, \(\sigma^2_X = Var[X]\)
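The same die example works for the variance formula: a quick Python sketch computing \(\sum_i \Pr(X = x_i)(x_i - \mathbb{E}[X])^2\).

```python
# Variance of a discrete r.v.: probability-weighted squared deviations
# from the mean. Fair six-sided die again.
outcomes = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mu = sum(p * x for p, x in zip(probs, outcomes))             # E[X] = 3.5
var = sum(p * (x - mu) ** 2 for p, x in zip(probs, outcomes))
print(var)  # 35/12, about 2.9167
```

So \(\sigma^2_X = 35/12\) and \(\sigma_X = \sqrt{35/12} \approx 1.71\) for a fair die.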
Let \(\bar{x}\) be the average value of a random sample of \(n\) observations of the random variable \(X\).
Then as \(n\) increases, \(\bar{x}\) gets closer and closer to \(\mu_X\) (the Law of Large Numbers)…
If you like calculus:
\[ \lim_{n \rightarrow \infty} \underbrace{\frac{1}{n} \sum_{i=1}^n X_i}_{\bar{x}} = \mu_X \]
For any two r.v.s \(X, Y\) and scalars \(a,b\):
\[ \mathbb{E}[aX + bY] = a \cdot \mathbb{E}[X] + b \cdot \mathbb{E}[Y] \]
If \(X, Y\) are independent:
\[ Var[aX + bY] = a^2 \cdot Var[X] + b^2 \cdot Var[Y] \]
If \(X, Y\) are not independent:
\[ Var[aX + bY] = a^2 \cdot Var[X] + b^2 \cdot Var[Y] + 2ab \cdot \rho_{X,Y} \cdot sd[X] \cdot sd[Y] \]
where \(\rho_{X,Y}\) is the correlation coefficient
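As a sanity check on the independent case with \(a = b = 1\): for two independent fair dice, \(Var[X + Y]\) computed from the full joint distribution should equal \(Var[X] + Var[Y]\). A short Python verification:

```python
# Variance of a sum of two independent fair dice.
# All 36 (x, y) pairs are equally likely, so the distribution of
# S = X + Y can be enumerated exactly.
from itertools import product

outcomes = [1, 2, 3, 4, 5, 6]

var_x = sum((x - 3.5) ** 2 for x in outcomes) / 6        # Var[X] = 35/12

sums = [x + y for x, y in product(outcomes, repeat=2)]
mu_s = sum(sums) / 36                                     # E[S] = 7
var_s = sum((s - mu_s) ** 2 for s in sums) / 36

print(var_s, 2 * var_x)  # both equal 35/6
```

With \(\rho_{X,Y} = 0\) the correlation term vanishes, recovering the independent-case formula above.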
