Cross-covariance matrix
In probability theory and statistics, a cross-covariance matrix is a matrix whose element in the $i$, $j$ position is the covariance between the $i$-th element of a random vector and the $j$-th element of another random vector. When the two random vectors are the same, the cross-covariance matrix is referred to as the covariance matrix. A random vector is a random variable with multiple dimensions, each element of which is a scalar random variable. Each element has either a finite number of observed empirical values or a finite or infinite number of potential values, specified by a theoretical joint probability distribution. Intuitively, the cross-covariance matrix generalizes the notion of covariance to multiple dimensions.
The cross-covariance matrix of two random vectors $\mathbf{X}$ and $\mathbf{Y}$ is typically denoted by $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ or $\Sigma_{\mathbf{X}\mathbf{Y}}$.
Definition
For random vectors $\mathbf{X}$ and $\mathbf{Y}$, each containing random elements whose expected value and variance exist, the cross-covariance matrix of $\mathbf{X}$ and $\mathbf{Y}$ is defined by[1]: 336

- $\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{cov}(\mathbf{X}, \mathbf{Y}) \stackrel{\mathrm{def}}{=} \operatorname{E}[(\mathbf{X} - \mathbf{\mu_X})(\mathbf{Y} - \mathbf{\mu_Y})^{\rm T}]$

where $\mathbf{\mu_X} = \operatorname{E}[\mathbf{X}]$ and $\mathbf{\mu_Y} = \operatorname{E}[\mathbf{Y}]$ are vectors containing the expected values of $\mathbf{X}$ and $\mathbf{Y}$. The vectors $\mathbf{X}$ and $\mathbf{Y}$ need not have the same dimension, and either might be a scalar value.
The cross-covariance matrix is the matrix whose $(i,j)$ entry is the covariance

- $\operatorname{K}_{X_i Y_j} = \operatorname{cov}[X_i, Y_j] = \operatorname{E}[(X_i - \operatorname{E}[X_i])(Y_j - \operatorname{E}[Y_j])]$

between the $i$-th element of $\mathbf{X}$ and the $j$-th element of $\mathbf{Y}$. This gives the following component-wise definition of the cross-covariance matrix.

- $\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \begin{bmatrix} \operatorname{E}[(X_1-\operatorname{E}[X_1])(Y_1-\operatorname{E}[Y_1])] & \operatorname{E}[(X_1-\operatorname{E}[X_1])(Y_2-\operatorname{E}[Y_2])] & \cdots & \operatorname{E}[(X_1-\operatorname{E}[X_1])(Y_n-\operatorname{E}[Y_n])] \\ \operatorname{E}[(X_2-\operatorname{E}[X_2])(Y_1-\operatorname{E}[Y_1])] & \operatorname{E}[(X_2-\operatorname{E}[X_2])(Y_2-\operatorname{E}[Y_2])] & \cdots & \operatorname{E}[(X_2-\operatorname{E}[X_2])(Y_n-\operatorname{E}[Y_n])] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[(X_m-\operatorname{E}[X_m])(Y_1-\operatorname{E}[Y_1])] & \operatorname{E}[(X_m-\operatorname{E}[X_m])(Y_2-\operatorname{E}[Y_2])] & \cdots & \operatorname{E}[(X_m-\operatorname{E}[X_m])(Y_n-\operatorname{E}[Y_n])] \end{bmatrix}$
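The component-wise definition translates directly into a sample estimator. The following is a minimal sketch in Python with NumPy; the function name, the rows-as-observations layout, and the biased $1/N$ normalization are illustrative choices, not part of the definition above:

```python
import numpy as np

def cross_covariance(X, Y):
    """Estimate the cross-covariance matrix from paired samples.

    X is an (N, m) array and Y an (N, n) array, each row one joint
    observation; the result is the m x n matrix whose (i, j) entry
    estimates E[(X_i - E[X_i]) (Y_j - E[Y_j])].
    """
    Xc = X - X.mean(axis=0)        # center each component of X
    Yc = Y - Y.mean(axis=0)        # center each component of Y
    return Xc.T @ Yc / X.shape[0]  # average of the outer products
```

The matrix product `Xc.T @ Yc` accumulates exactly the component-wise products appearing in the matrix above, one product per $(i, j)$ entry.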
Example
For example, if $\mathbf{X} = (X_1, X_2, X_3)^{\rm T}$ and $\mathbf{Y} = (Y_1, Y_2)^{\rm T}$ are random vectors, then $\operatorname{cov}(\mathbf{X}, \mathbf{Y})$ is a $3 \times 2$ matrix whose $(i,j)$-th entry is $\operatorname{cov}(X_i, Y_j)$.
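A quick numerical illustration of the shapes involved, as a self-contained sketch (NumPy assumed; the synthetic data are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000
X = rng.normal(size=(N, 3))             # samples of X = (X1, X2, X3)^T
Y = X[:, :2] + rng.normal(size=(N, 2))  # Y built from X1, X2 plus noise

Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
K = Xc.T @ Yc / N                       # sample cross-covariance
print(K.shape)                          # (3, 2), as in the example
```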
Properties
For the cross-covariance matrix, the following basic properties apply:[2]
- $\operatorname{cov}(\mathbf{X}, \mathbf{Y}) = \operatorname{E}[\mathbf{X}\mathbf{Y}^{\rm T}] - \mathbf{\mu_X}\mathbf{\mu_Y}^{\rm T}$
- $\operatorname{cov}(\mathbf{X}, \mathbf{Y}) = \operatorname{cov}(\mathbf{Y}, \mathbf{X})^{\rm T}$
- $\operatorname{cov}(\mathbf{X_1} + \mathbf{X_2}, \mathbf{Y}) = \operatorname{cov}(\mathbf{X_1}, \mathbf{Y}) + \operatorname{cov}(\mathbf{X_2}, \mathbf{Y})$
- $\operatorname{cov}(A\mathbf{X} + \mathbf{a}, B^{\rm T}\mathbf{Y} + \mathbf{b}) = A \,\operatorname{cov}(\mathbf{X}, \mathbf{Y})\, B$
- If $\mathbf{X}$ and $\mathbf{Y}$ are independent (or, somewhat less restrictively, if every random variable in $\mathbf{X}$ is uncorrelated with every random variable in $\mathbf{Y}$), then $\operatorname{cov}(\mathbf{X}, \mathbf{Y}) = 0_{p \times q}$

where $\mathbf{X}$, $\mathbf{X_1}$ and $\mathbf{X_2}$ are random $p \times 1$ vectors, $\mathbf{Y}$ is a random $q \times 1$ vector, $\mathbf{a}$ is a $q \times 1$ vector, $\mathbf{b}$ is a $p \times 1$ vector, $A$ and $B$ are $q \times p$ matrices of constants, and $0_{p \times q}$ is a $p \times q$ matrix of zeroes.
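Because the sample cross-covariance is bilinear in the centered data, the affine-transformation and transpose properties hold exactly for the estimator as well, which makes a numerical spot check straightforward. A sketch under the same dimension conventions (NumPy assumed; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
p, q, N = 3, 2, 500
X = rng.normal(size=(N, p))   # p-dimensional X, one sample per row
Y = rng.normal(size=(N, q))   # q-dimensional Y
A = rng.normal(size=(q, p))   # q x p, as in the conventions above
B = rng.normal(size=(q, p))
a = rng.normal(size=q)        # q x 1 shift
b = rng.normal(size=p)        # p x 1 shift

def cov(U, V):
    Uc, Vc = U - U.mean(axis=0), V - V.mean(axis=0)
    return Uc.T @ Vc / U.shape[0]

# cov(AX + a, B^T Y + b) = A cov(X, Y) B
# (rows are samples, so AX becomes X @ A.T and B^T Y becomes Y @ B)
lhs = cov(X @ A.T + a, Y @ B + b)
rhs = A @ cov(X, Y) @ B
print(np.allclose(lhs, rhs))                # True
print(np.allclose(cov(X, Y), cov(Y, X).T))  # transpose property: True
```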
Definition for complex random vectors
If $\mathbf{Z}$ and $\mathbf{W}$ are complex random vectors, the definition of the cross-covariance matrix is slightly changed. Transposition is replaced by Hermitian transposition:

- $\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{cov}(\mathbf{Z}, \mathbf{W}) \stackrel{\mathrm{def}}{=} \operatorname{E}[(\mathbf{Z} - \mathbf{\mu_Z})(\mathbf{W} - \mathbf{\mu_W})^{\rm H}]$
For complex random vectors, another matrix called the pseudo-cross-covariance matrix is defined as follows:
- $\operatorname{J}_{\mathbf{Z}\mathbf{W}} = \operatorname{cov}(\mathbf{Z}, \overline{\mathbf{W}}) \stackrel{\mathrm{def}}{=} \operatorname{E}[(\mathbf{Z} - \mathbf{\mu_Z})(\mathbf{W} - \mathbf{\mu_W})^{\rm T}]$
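The two complex definitions differ only in where the conjugation sits, which the following sketch makes explicit (NumPy assumed; the synthetic complex data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1_000
Z = rng.normal(size=(N, 3)) + 1j * rng.normal(size=(N, 3))
W = rng.normal(size=(N, 2)) + 1j * rng.normal(size=(N, 2))

Zc = Z - Z.mean(axis=0)
Wc = W - W.mean(axis=0)

K = Zc.T @ Wc.conj() / N  # Hermitian version: entry (i, j) averages Z_i * conj(W_j)
J = Zc.T @ Wc / N         # pseudo version: plain transpose, no conjugation
```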
Uncorrelatedness
Two random vectors $\mathbf{X}$ and $\mathbf{Y}$ are called uncorrelated if their cross-covariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is a zero matrix.[1]: 337
Complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if both their cross-covariance matrix and their pseudo-cross-covariance matrix are zero, i.e. if $\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{J}_{\mathbf{Z}\mathbf{W}} = 0$.
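As a sanity check, a sketch (NumPy assumed) with independently drawn vectors gives a sample cross-covariance matrix whose entries are close to zero and shrink as the sample size grows:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000
X = rng.normal(size=(N, 3))
Y = rng.normal(size=(N, 2))  # drawn independently of X

Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
K = Xc.T @ Yc / N
print(np.abs(K).max())       # small; zero only up to sampling error
```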
References
1. Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.
2. Taboga, Marco (2010). "Lectures on probability theory and mathematical statistics".