Let $Y_1, Y_2, \cdots, Y_n$ be independent random variables such that each $Y_i$ has a gamma distribution with parameters $\alpha_i$ and $\beta$. That is, the distributions of the $Y$'s may have different $\alpha$'s, but all have the same value of $\beta$. Prove that $U = Y_1 + Y_2 +\cdots +Y_n$ has a gamma distribution with parameters $\alpha_1+\alpha_2+\cdots +\alpha_n$ and $\beta$.

#### Solution

Let $Y_1, Y_2, \cdots, Y_n$ be independent random variables such that each $Y_i$ has a gamma distribution with parameters $\alpha_i$ and $\beta$.

The m.g.f. of $Y_i$ is

$$ \begin{aligned} M_{Y_i}(t) &= E(e^{tY_i}) \\ &= \big(1-\beta t\big)^{-\alpha_i} \end{aligned} $$
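For completeness, this m.g.f. can be derived directly, assuming the scale parameterization of the gamma density, $f(y) = \frac{1}{\Gamma(\alpha_i)\beta^{\alpha_i}}\, y^{\alpha_i-1} e^{-y/\beta}$ for $y>0$, which is the parameterization consistent with the m.g.f. above; the integral converges for $t < 1/\beta$:

$$ \begin{aligned} M_{Y_i}(t) &= \int_0^\infty e^{ty}\,\frac{1}{\Gamma(\alpha_i)\beta^{\alpha_i}}\, y^{\alpha_i-1} e^{-y/\beta}\,dy \\ &= \frac{1}{\Gamma(\alpha_i)\beta^{\alpha_i}} \int_0^\infty y^{\alpha_i-1} e^{-y(1-\beta t)/\beta}\,dy \\ &= \frac{1}{\Gamma(\alpha_i)\beta^{\alpha_i}} \cdot \Gamma(\alpha_i)\left(\frac{\beta}{1-\beta t}\right)^{\alpha_i} \\ &= \big(1-\beta t\big)^{-\alpha_i}, \end{aligned} $$

where the last integral is the gamma integral $\int_0^\infty y^{a-1}e^{-y/\theta}\,dy = \Gamma(a)\,\theta^a$ with $\theta = \beta/(1-\beta t)$.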

Let $U = Y_1 + Y_2 +\cdots +Y_n$.

Then the m.g.f. of $U$ is

$$ \begin{aligned} M_U(t) &= E(e^{tU}) \\ &= E(e^{t(Y_1 + Y_2 +\cdots +Y_n)}) \\ &= E(e^{tY_1} e^{tY_2}\cdots e^{tY_n}) \\ &= E(e^{tY_1})E(e^{tY_2})\cdots E(e^{tY_n})\\ & \qquad (\because \text{the } Y_i \text{ are independent})\\ &= M_{Y_1}(t)M_{Y_2}(t)\cdots M_{Y_n}(t)\\ &= \big(1-\beta t\big)^{-\alpha_1}\big(1-\beta t\big)^{-\alpha_2}\cdots \big(1-\beta t\big)^{-\alpha_n}\\ &= \big(1-\beta t\big)^{-(\alpha_1+\alpha_2+\cdots+\alpha_n)}. \end{aligned} $$

This is the m.g.f. of a gamma variate with parameters $(\alpha_1+\alpha_2+\cdots +\alpha_n, \beta)$. Hence, by the uniqueness theorem for m.g.f.'s, $U=Y_1+Y_2+\cdots +Y_n$ is a gamma variate with parameters $(\alpha_1+\alpha_2+\cdots +\alpha_n, \beta)$.
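The result can also be checked numerically. The sketch below (illustrative shape parameters $\alpha_i$ and scale $\beta$ chosen here for the example) draws independent gamma variates, sums them, and compares the sample mean and variance of the sum against the theoretical values for a $\text{Gamma}(\sum_i \alpha_i,\ \beta)$ distribution, namely $E(U)=\beta\sum_i\alpha_i$ and $\text{Var}(U)=\beta^2\sum_i\alpha_i$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: different shapes, one common scale beta
alphas = [0.5, 1.0, 2.5]
beta = 2.0
n_samples = 200_000

# Draw each Y_i ~ Gamma(alpha_i, scale=beta) and sum across i
u = sum(rng.gamma(a, beta, n_samples) for a in alphas)

# Theory: U ~ Gamma(alpha_total, scale=beta)
alpha_total = sum(alphas)  # 4.0
print("sample mean:", u.mean(), " theoretical:", alpha_total * beta)      # ~8
print("sample var: ", u.var(),  " theoretical:", alpha_total * beta**2)   # ~16
```

NumPy's `rng.gamma(shape, scale, size)` uses the same scale parameterization as the density above, so the comparison is parameter-for-parameter consistent with the proof.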
