
Cochran's Theorem

This important theorem allows us to decompose a sum of squares into several quadratic forms, identify their distributions, and establish their independence. It is used to great advantage in Analysis of Variance and Regression, where the importance of the terms in a model is assessed via the distributions of their sums of squares.



Theorem 4.7

Given ${\bf X}\sim N_p({\bf0,I})$, suppose that ${\bf X}'{\bf X}$ is decomposed into $k$ quadratic forms
$Q_i={\bf X}'{\bf B}_i{\bf X}$, $i=1,\,2,\,\dots ,\,k$, where each ${\bf B}_i$ is positive semidefinite with rank $r_i$. Then any one of the following conditions implies the other two.

(a)
the ranks of the $Q_i$ add to $p$;
(b)
each $Q_i\sim \chi^2_{r_i}$;
(c)
all the $Q_i$ are mutually independent.
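
Before the proof, here is a minimal numerical sketch of the three conditions, assuming numpy is available; the matrices ${\bf B}_1={\bf I}-\frac{1}{p}{\bf J}$ and ${\bf B}_2=\frac{1}{p}{\bf J}$ (with ${\bf J}$ the matrix of ones) are illustrative choices that anticipate Example 4.2.

import numpy as np

p = 5
J = np.ones((p, p))                 # matrix of ones
B1 = np.eye(p) - J / p              # centring matrix, rank p - 1
B2 = J / p                          # projection onto the mean, rank 1

# (a): the ranks add to p
print(np.linalg.matrix_rank(B1) + np.linalg.matrix_rank(B2) == p)  # True

# (b) holds iff each B_i is idempotent
print(np.allclose(B1 @ B1, B1), np.allclose(B2 @ B2, B2))          # True True

# (c) holds iff B_i B_j = 0 for i != j
print(np.allclose(B1 @ B2, 0))                                     # True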


Proof We can write

\begin{displaymath}{\bf X}'{\bf X}={\bf X}'{\bf IX}=\sum^k_{i=1}{\bf X}'{\bf B}_i{\bf X}. \end{displaymath}

Since this identity holds for every ${\bf X}$, it follows that

\begin{displaymath}{\bf I} = \sum^k_{i=1} {\bf B}_i. \end{displaymath}

(i)
Given (a) we will prove (b).

Select an arbitrary $Q_i$, say $Q_1={\bf X}'{\bf B}_1{\bf X}$. Making an orthogonal transformation ${\bf X=PY}$ that diagonalizes ${\bf B}_1$, we obtain

\begin{eqnarray*}
{\bf X'B_1X+X'(I-B_1)X} &=& {\bf X'IX} \\
{\bf Y'P'B_1PY+Y'P'(I-B_1)PY} &=& {\bf Y'P'IPY} \qquad (4.9) \\
&=& {\bf Y'IY}.
\end{eqnarray*}

Since ${\bf P'B_1P}$ and ${\bf I}$ are diagonal, so is ${\bf P'(I-B_1)P}$. Since $r({\bf B}_1)=r_1$ and therefore $r({\bf P'B_1P})=r_1$, exactly $p-r_1$ of the leading diagonal elements of ${\bf P'B_1P}$ are zero, and the corresponding elements of ${\bf P'(I-B_1)P}$ are $1$. Now ${\bf I-B_1}=\sum_{i=2}^k{\bf B}_i$, so by (a) and the subadditivity of rank, $r({\bf P'(I-B_1)P})\leq\sum_{i=2}^k r_i=p-r_1$; hence the remaining elements of its leading diagonal are $0$, and the corresponding elements of ${\bf P'B_1P}$ are $1$. Thus ${\bf B}_1$ is idempotent and, from Theorem 4.4, $Q_1\sim \chi^2_{r_1}$.

The same result holds for the other ${\bf B}_i$ and we have established (b) from (a).
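
In display form (with the columns of ${\bf P}$ ordered so that the unit eigenvalues of ${\bf B}_1$ come first), the argument above gives

\begin{displaymath}{\bf P'B}_1{\bf P} = \left[ \begin{array}{ll}
{\bf I}_{r_1} & {\bf0}\\
{\bf0} & {\bf0}
\end{array} \right], \qquad
{\bf P'(I-B}_1{\bf )P} = \left[ \begin{array}{ll}
{\bf0} & {\bf0}\\
{\bf0} & {\bf I}_{p-r_1}
\end{array} \right],\end{displaymath}

which is precisely the canonical form used for ${\bf B}_j$ in part (ii) below.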

(ii)
Given (b) we will prove (c).
\begin{displaymath}{\bf I=B_1+B_2+\dots + B}_k \end{displaymath} (4.10)

and (b) implies that each ${\bf B}_i$ is idempotent (with rank $r_i$). Choose one of the ${\bf B}_i$ arbitrarily, say ${\bf B}_j$. There is an orthogonal matrix ${\bf C}$ such that

\begin{displaymath}{\bf C'B}_j{\bf C} = \left[ \begin{array}{ll}
{\bf I}_{r_j} & {\bf0}\\
{\bf0} & {\bf0}
\end{array} \right]. \end{displaymath}

Premultiplying (4.10) by ${\bf C}'$ and post-multiplying by ${\bf C}$, we have

\begin{displaymath}{\bf C'IC=I}=\sum^k_{i=1,i\neq j} {\bf C'B}_i{\bf C} + \left[ \begin{array}{ll}
{\bf I}_{r_j} & {\bf0}\\
{\bf0} & {\bf0}
\end{array} \right] . \end{displaymath}

Now each ${\bf C'B}_i{\bf C}$ is idempotent, hence positive semidefinite, and so has no negative elements on its diagonal. Comparing diagonal elements on each side, each ${\bf C'B}_i{\bf C}$, $i\neq j$, must have its first $r_j$ leading diagonal elements equal to $0$; and since a zero diagonal element of a symmetric idempotent matrix forces the whole of that row and column to be zero, the submatrix of rows $r_j+1,\,\dots,\,p$, columns $1,\,\dots,\,r_j$, and the submatrix of rows $1,\,\dots,\,r_j$, columns $r_j+1,\,\dots,\,p$, have all elements $0$. So

\begin{displaymath}{\bf C'B}_i{\bf CC'B}_j{\bf C}={\bf0} \ ,
\ \ \ i=1,\,2,\,\dots ,\,k, \ i\neq j,\end{displaymath}

and thus ${\bf C}'{\bf B}_i{\bf B}_j{\bf C}={\bf0}$, which can only be so if ${\bf B}_i{\bf B}_j={\bf0}$. By Craig's theorem on the independence of quadratic forms, ${\bf B}_i{\bf B}_j={\bf0}$ implies that $Q_i$ and $Q_j$ are independent.
Since ${\bf B}_j$ was arbitrarily chosen, all the pairwise products vanish and we have proved (c) from (b).
(iii)
Given (b) we will prove (a).

If (b) holds, each ${\bf B}_i$ is idempotent, with $r_i$ eigenvalues equal to $1$ and $p-r_i$ equal to zero, so that $\mbox{tr}({\bf B}_i)=r_i$. Since ${\bf I}=\sum {\bf B}_i$, taking traces we have $p=\sum r_i$.

(iv)
Given (c) we will prove (b).

If (c) holds, then ${\bf B}_i{\bf B}_j={\bf0}$ for $i\neq j$ (independent quadratic forms have zero covariance, and for positive semidefinite matrices $\mbox{tr}({\bf B}_i{\bf B}_j)=0$ forces ${\bf B}_i{\bf B}_j={\bf0}$). Taking powers of ${\bf I}=\sum^k_{i=1} {\bf B}_i$, the cross terms vanish and we have $\sum^k_{i=1}{\bf B}^s_i={\bf I}$ for all positive integers $s$. Taking traces we have

\begin{displaymath}\mbox{tr}(\sum^k_{i=1}{\bf B}_i^s)=p \ , \ \ \mbox{for all $s$}. \end{displaymath}

Since the ${\bf B}_i$ are positive semidefinite, this can hold for every $s$ if and only if each eigenvalue of each ${\bf B}_i$ is $0$ or $1$: an eigenvalue greater than $1$ would make the trace grow with $s$, and one strictly between $0$ and $1$ would make it shrink. That is, each ${\bf B}_i$ is idempotent, and so each $Q_i \sim \chi^2_{r_i}$.

So we have proved (b) from (c).

A more general version of Cochran's Theorem is stated (without proof) in Theorem 4.8.



Theorem 4.8

Given ${\bf X} \sim N_p({\bf0},\,\sigma^2{\bf I})$, suppose that ${\bf X'X}$ is decomposed into $k$ quadratic forms, $Q_i = {\bf X'B}_i{\bf X}$, $i=1,\,2,\,\dots ,\,k$, where $r({\bf B}_i)=r_i$. Then $Q_1,\,Q_2,\,\dots,\,Q_k$ are mutually independent and $Q_i/\sigma^2 \sim \chi^2_{r_i}$ if and only if $\sum^k_{i=1} r_i=p$.
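
To see Theorem 4.8 in action, here is a minimal simulation sketch, assuming numpy and re-using the illustrative centring decomposition from the sketch above (these matrices are not part of the theorem statement); it checks that $Q_i/\sigma^2$ has the mean $r_i$ and variance $2r_i$ of a $\chi^2_{r_i}$ variable, and that $Q_1$ and $Q_2$ are uncorrelated.

import numpy as np

rng = np.random.default_rng(0)
p, sigma, n = 5, 2.0, 200_000       # dimension, scale, replications

J = np.ones((p, p))
B1 = np.eye(p) - J / p              # rank p - 1
B2 = J / p                          # rank 1

X = rng.normal(0.0, sigma, size=(n, p))
Q1 = np.einsum('ni,ij,nj->n', X, B1, X) / sigma**2
Q2 = np.einsum('ni,ij,nj->n', X, B2, X) / sigma**2

# chi^2_r has mean r and variance 2r
print(Q1.mean(), Q1.var())          # approx 4 and 8
print(Q2.mean(), Q2.var())          # approx 1 and 2
print(np.corrcoef(Q1, Q2)[0, 1])    # approx 0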



Example 4.2
We will consider again Example 4.4 from the point of view of Cochran's Theorem. Recall that $X_1, \ldots, X_p$ are iid $N(0,1)$ and

\begin{displaymath}\sum_{i=1}^p(x_i-\overline{x})^2=\sum_{i=1}^p x_i^2 \, - \, \frac{(\sum x_i)^2}{p}
=\sum_{i=1}^p x_i^2 \, - \, p \overline{x}^2.\end{displaymath}

That is,

\begin{displaymath}\sum x_i^2 \, = \, (p-1)s^2 \, + \, p \overline{x}^2,\end{displaymath}

where $s^2=\frac{1}{p-1}\sum_{i=1}^p(x_i-\overline{x})^2$ is the usual sample variance. Equivalently,

\begin{displaymath}{\bf X}'{\bf I}{\bf X} \, = \, {\bf X}'{\bf B}{\bf X} \, + \, {\bf X}'{\bf C}{\bf X},\end{displaymath}

where ${\bf B}$ and ${\bf C}$ are defined in Example 4.1.
We can apply Cochran's Theorem: condition (a) holds, since $r({\bf I})=p$, $r({\bf B})=p-1$ and $r({\bf C})=1$, so the ranks add to $p$. We may therefore conclude that

\begin{displaymath}\sum X_i^2 \sim \chi^2_p, \ \ \nu S^2 \sim \chi^2_{\nu} \ \mbox{ where } \nu=p-1, \ \mbox{ and } \ p \overline{X}^2 \sim \chi^2_1,\end{displaymath}

and that $\overline{X}$ and $S^2$ are independent.
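
As a quick numerical check of the algebra, here is a minimal sketch assuming numpy, and assuming that ${\bf B}={\bf I}-\frac{1}{p}{\bf J}$ and ${\bf C}=\frac{1}{p}{\bf J}$ are the centring and averaging matrices of Example 4.1:

import numpy as np

rng = np.random.default_rng(1)
p = 6
x = rng.standard_normal(p)          # one sample of X_1, ..., X_p

J = np.ones((p, p))
B = np.eye(p) - J / p               # x'Bx = (p-1)s^2
C = J / p                           # x'Cx = p * xbar^2

xbar = x.mean()
s2 = x.var(ddof=1)                  # the usual sample variance

print(np.isclose(x @ x, (p - 1) * s2 + p * xbar**2))   # True
print(np.isclose(x @ B @ x, (p - 1) * s2))             # True
print(np.isclose(x @ C @ x, p * xbar**2))              # True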

