
Moment Generating Function

Moments

A moment is the product of a distance and some other quantity. For example, in physics, the first moment of a force is torque, which equals $rF$; for a system of forces, you take the sum of all of the $rF$'s. In statistics, it's very similar, except values are weighted by probabilities. The first moment is the mean: $$E(X) = \sum_{i=1}^n p_ix_i $$ Here is the second moment: $$E(X^2) = \sum_{i=1}^n p_ix_i^2 $$ Both of the above turn out to be very useful, because together they give the variance: $$Var(X) = \sum_{i=1}^n p_i(x_i - \mu)^2$$ To see why that definition makes sense, imagine the probabilities were uniform, meaning $p(x_i)=\frac{1}{n}$; the variance would then be the familiar average squared deviation: $$\frac{1}{n}\sum_{i=1}^n{(x_i - \mu)^2}$$ Now, let's expand the formula above, where the mean is $\mu = E(X)$: $$Var(X) = \sum_{i=1}^n p_i(x_i - \mu)^2$$ $$=\sum_{i=1}^n p_i(x_i^2 - 2\mu{}x_i + \mu^2)$$ $$=\sum_{i=1}^n p_ix_i^2 -2\mu\sum_{i=1}^n p_ix_i + \mu^2\sum_{i=1}^n p_i$$ Since $\sum_{i=1}^n p_i = 1$ and $\mu = E(X)$, this becomes $$=E(X^2) - 2\mu E(X) + \mu^2 = E(X^2) - 2E(X)^2 + E(X)^2$$ $$=E(X^2) - E(X)^2$$
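To double-check the identity $Var(X) = E(X^2) - E(X)^2$ numerically, here is a minimal Python sketch; the support and probabilities form a made-up toy distribution, chosen only for illustration:

```python
# Toy discrete distribution (hypothetical values, for illustration only).
xs = [1, 2, 3, 4]          # support
ps = [0.1, 0.2, 0.3, 0.4]  # probabilities, which sum to 1

mean = sum(p * x for p, x in zip(ps, xs))        # first moment, E(X)
second = sum(p * x**2 for p, x in zip(ps, xs))   # second moment, E(X^2)

# Variance computed directly from the definition...
var_direct = sum(p * (x - mean)**2 for p, x in zip(ps, xs))
# ...and from the two moments.
var_moments = second - mean**2

print(var_direct, var_moments)  # both print 1.0
```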

Moment Generating Function

As we can see, we never have to compute the variance from its definition: if we know the first and second moments, the variance follows immediately. The moment generating function does just this. It packages the moments into a single function, from which the first and second moments can be read off easily.


Here it is:

$$M_X(t) = E(e^{tX}) = \sum_{x \in D}e^{tx}p(x)$$ Let's evaluate it and its derivatives at $t = 0$: $$M_X(0) = \sum_{x \in D}e^{0\cdot x}p(x) = \sum_{x \in D}p(x) = 1$$ $$M'_X(t) = \sum_{x \in D}xe^{tx}p(x) \implies M'_X(0) = \sum_{x \in D}xp(x) = E(X)$$ $$M''_X(t) = \sum_{x \in D}x^2e^{tx}p(x) \implies M''_X(0) = \sum_{x \in D}x^2p(x) = E(X^2)$$ $e^{tx}$ has the quality that differentiating with respect to $t$ just adds a factor of $x$ in front, and setting $t = 0$ turns every exponential into $1$, leaving exactly the moment sums. So, let's apply this great quality to an example.
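To see the differentiate-at-$0$ trick concretely, here is a short sketch that builds $M_X(t)$ for the same toy distribution as above and differentiates it symbolically. Using sympy is an assumption of this sketch, not part of the original derivation:

```python
import sympy as sp

t = sp.symbols('t')
xs = [1, 2, 3, 4]                                # toy support
ps = [sp.Rational(k, 10) for k in (1, 2, 3, 4)]  # toy probabilities

# M(t) = E(e^{tX}) = sum over the support of e^{tx} p(x)
M = sum(p * sp.exp(t * x) for p, x in zip(ps, xs))

print(M.subs(t, 0))                 # 1, since the probabilities sum to 1
print(sp.diff(M, t).subs(t, 0))     # E(X)   = 3
print(sp.diff(M, t, 2).subs(t, 0))  # E(X^2) = 10
```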

Example

We want to find the mean and variance of a binomial distribution. Recall that the binomial pmf is: $$p(k) = {n \choose k}p^k\left(1-p\right)^{n-k}$$ It is very difficult to find the variance of this directly. However, if we take the moment generating function, it becomes very easy. $$M(t) = E(e^{tX}) = \sum_{k=0}^{n}e^{tk}{n \choose k}p^k\left(1-p\right)^{n-k}$$ $$M(t) = \sum_{k=0}^{n}{n \choose k}\left(pe^t\right)^k\left(1-p\right)^{n-k}$$ We can recognize this as the binomial theorem: $$M(t) = (pe^t + 1 - p)^n$$ Let's just confirm this makes sense: $M(0)$ should equal $1$. $$M(0) = (pe^0 + 1 - p)^n = (p + 1 - p)^n = 1^n = 1$$ Now differentiate and evaluate at $0$: $$M'(t) = n(pe^t + 1 - p)^{n-1}pe^{t}$$ $$M'(0) = np(p + 1 - p)^{n-1} = \boxed{np}$$ $$M''(t) = n(n-1)(pe^t + 1 - p)^{n-2}p^2e^{2t} + n(pe^t + 1 - p)^{n-1}pe^{t}$$ $$M''(0) = n(n-1)(p + 1 - p)^{n-2}p^2 + n(p + 1 - p)^{n-1}p = n(n-1)p^2 + np = n^2p^2 - np^2 + np$$ $$Var(X) = M''(0) - M'(0)^2 = n^2p^2 - np^2 + np - n^2p^2 = np - np^2 = \boxed{np(1-p)}$$
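As a check on the boxed results, the same symbolic approach recovers $np$ and $np(1-p)$ by differentiating $(pe^t + 1 - p)^n$ at $t = 0$; again, sympy is just an assumed tool here:

```python
import sympy as sp

t, p, n = sp.symbols('t p n')
M = (p * sp.exp(t) + 1 - p) ** n      # binomial MGF

mean = sp.diff(M, t).subs(t, 0)       # M'(0)
second = sp.diff(M, t, 2).subs(t, 0)  # M''(0)
var = sp.simplify(second - mean**2)   # M''(0) - M'(0)^2

print(sp.simplify(mean))  # n*p
print(var)                # equals n*p*(1 - p)
```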
