Cross-correlation matrix
The cross-correlation matrix of two random vectors is a matrix containing as elements the cross-correlations of all pairs of elements of the random vectors. The cross-correlation matrix is used in various digital signal processing algorithms.
Definition
For two random vectors $\mathbf{X} = (X_1, \ldots, X_m)^{\mathrm{T}}$ and $\mathbf{Y} = (Y_1, \ldots, Y_n)^{\mathrm{T}}$, each containing random elements whose expected value and variance exist, the cross-correlation matrix of $\mathbf{X}$ and $\mathbf{Y}$ is defined by[1]: p. 337

$$\operatorname{R}_{\mathbf{X}\mathbf{Y}} \triangleq \operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}]$$

and has dimensions $m \times n$. Written component-wise:

$$\operatorname{R}_{\mathbf{X}\mathbf{Y}} = \begin{bmatrix} \operatorname{E}[X_1 Y_1] & \operatorname{E}[X_1 Y_2] & \cdots & \operatorname{E}[X_1 Y_n] \\ \operatorname{E}[X_2 Y_1] & \operatorname{E}[X_2 Y_2] & \cdots & \operatorname{E}[X_2 Y_n] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[X_m Y_1] & \operatorname{E}[X_m Y_2] & \cdots & \operatorname{E}[X_m Y_n] \end{bmatrix}$$

The random vectors $\mathbf{X}$ and $\mathbf{Y}$ need not have the same dimension, and either might be a scalar value.
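In practice the expectation is usually replaced by a sample average over joint draws of the two vectors. The following is a minimal NumPy sketch (not part of the cited reference); the dimensions and the way Y is generated are arbitrary choices for illustration.

```python
# Minimal sketch: estimate R_XY = E[X Y^T] from N joint samples (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
m, n, N = 4, 3, 100_000                        # dimensions of X and Y, number of samples

X = rng.normal(size=(N, m))                    # N samples of X, one per row
Y = 0.5 * X[:, :n] + rng.normal(size=(N, n))   # Y deliberately correlated with X

# Averaging the outer products X_k Y_k^T over all samples gives an m x n estimate
# whose (i, j) entry approximates E[X_i Y_j].
R_XY = X.T @ Y / N
print(R_XY.shape)                              # (4, 3)
```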
Example
For example, if $\mathbf{X} = (X_1, X_2, X_3)^{\mathrm{T}}$ and $\mathbf{Y} = (Y_1, Y_2)^{\mathrm{T}}$ are random vectors, then $\operatorname{R}_{\mathbf{X}\mathbf{Y}}$ is a $3 \times 2$ matrix whose $(i,j)$-th entry is $\operatorname{E}[X_i Y_j]$.
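As a sanity check on the shapes involved, here is a small sketch (again illustrative, not from the article) that builds this $3 \times 2$ matrix entry by entry, mirroring the component-wise definition above.

```python
# Illustrative sketch: build the 3 x 2 cross-correlation matrix entry by entry.
import numpy as np

rng = np.random.default_rng(1)
N = 50_000
X = rng.normal(size=(N, 3))             # X = (X_1, X_2, X_3)^T
Y = X[:, :2] + rng.normal(size=(N, 2))  # Y = (Y_1, Y_2)^T, correlated with X

# Entry (i, j) is the sample estimate of E[X_i Y_j].
R = np.array([[np.mean(X[:, i] * Y[:, j]) for j in range(2)] for i in range(3)])
print(R.shape)                          # (3, 2)
```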
Complex random vectors
If $\mathbf{Z} = (Z_1, \ldots, Z_m)^{\mathrm{T}}$ and $\mathbf{W} = (W_1, \ldots, W_n)^{\mathrm{T}}$ are complex random vectors, each containing random variables whose expected value and variance exist, the cross-correlation matrix of $\mathbf{Z}$ and $\mathbf{W}$ is defined by

$$\operatorname{R}_{\mathbf{Z}\mathbf{W}} \triangleq \operatorname{E}[\mathbf{Z}\mathbf{W}^{\mathrm{H}}]$$

where ${}^{\mathrm{H}}$ denotes Hermitian transposition.
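For the complex case the only change to the earlier sketch is that the plain transpose becomes a conjugate (Hermitian) transpose; the setup below is again purely illustrative.

```python
# Illustrative sketch: estimate R_ZW = E[Z W^H] for complex random vectors.
import numpy as np

rng = np.random.default_rng(2)
N, m, n = 100_000, 3, 2

Z = rng.normal(size=(N, m)) + 1j * rng.normal(size=(N, m))
W = Z[:, :n] + rng.normal(size=(N, n)) + 1j * rng.normal(size=(N, n))

# Note the conjugate on W: entry (i, j) estimates E[Z_i * conj(W_j)].
R_ZW = Z.T @ W.conj() / N
print(R_ZW.shape)                       # (3, 2), complex-valued
```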
Uncorrelatedness
Two random vectors $\mathbf{X} = (X_1, \ldots, X_m)^{\mathrm{T}}$ and $\mathbf{Y} = (Y_1, \ldots, Y_n)^{\mathrm{T}}$ are called uncorrelated if

$$\operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm{T}}.$$

They are uncorrelated if and only if their cross-covariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is zero.

Two complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if

$$\operatorname{E}[\mathbf{Z}\mathbf{W}^{\mathrm{H}}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\mathrm{H}}$$

and

$$\operatorname{E}[\mathbf{Z}\mathbf{W}^{\mathrm{T}}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\mathrm{T}}.$$
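A quick numerical illustration (a sketch, not from the cited text): two independently generated real random vectors should satisfy $\operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}] \approx \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm{T}}$ up to sampling error.

```python
# Illustrative check: independent vectors are uncorrelated, so E[X Y^T] ~ E[X] E[Y]^T.
import numpy as np

rng = np.random.default_rng(3)
N = 200_000
X = rng.normal(size=(N, 3))
Y = rng.normal(size=(N, 2))             # generated independently of X

R_XY = X.T @ Y / N                                  # estimate of E[X Y^T]
outer = np.outer(X.mean(axis=0), Y.mean(axis=0))    # estimate of E[X] E[Y]^T
print(np.max(np.abs(R_XY - outer)))                 # small, shrinking as N grows
```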
Properties
Relation to the cross-covariance matrix
The cross-correlation matrix is related to the cross-covariance matrix as follows:

$$\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{Y} - \operatorname{E}[\mathbf{Y}])^{\mathrm{T}}] = \operatorname{R}_{\mathbf{X}\mathbf{Y}} - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm{T}}$$

Respectively, for complex random vectors:

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\mathrm{H}}] = \operatorname{R}_{\mathbf{Z}\mathbf{W}} - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\mathrm{H}}$$
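The identity also holds exactly for the corresponding sample estimates, which the following sketch (illustrative assumptions as before) verifies for the real case.

```python
# Illustrative check of K_XY = R_XY - E[X] E[Y]^T on sample estimates (real case).
import numpy as np

rng = np.random.default_rng(4)
N = 100_000
X = rng.normal(size=(N, 3))
Y = 0.7 * X[:, :2] + rng.normal(size=(N, 2))

mX, mY = X.mean(axis=0), Y.mean(axis=0)
K_direct = (X - mX).T @ (Y - mY) / N        # centered definition of K_XY
K_from_R = X.T @ Y / N - np.outer(mX, mY)   # R_XY minus E[X] E[Y]^T
print(np.allclose(K_direct, K_from_R))      # True (up to floating-point rounding)
```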
See also
- Autocorrelation
- Correlation does not imply causation
- Covariance function
- Pearson product-moment correlation coefficient
- Correlation function (astronomy)
- Correlation function (statistical mechanics)
- Correlation function (quantum field theory)
- Mutual information
- Rate distortion theory
- Radial distribution function
References
1. Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.
Further reading
- Hayes, Monson H., Statistical Digital Signal Processing and Modeling, John Wiley & Sons, Inc., 1996. ISBN 0-471-59431-8.
- Solomon W. Golomb and Guang Gong. Signal design for good correlation: for wireless communication, cryptography, and radar. Cambridge University Press, 2005.
- M. Soltanalian. Signal Design for Active Sensing and Communications. Uppsala Dissertations from the Faculty of Science and Technology (printed by Elanders Sverige AB), 2014.