Cross-correlation
In signal processing, cross-correlation is a measure of similarity of two series as a function of the displacement of one relative to the other. This is also known as a sliding dot product or sliding inner-product. It is commonly used for searching a long signal for a shorter, known feature. It has applications in pattern recognition, single particle analysis, electron tomography, averaging, cryptanalysis, and neurophysiology. The cross-correlation is similar in nature to the convolution of two functions. In an autocorrelation, which is the cross-correlation of a signal with itself, there will always be a peak at a lag of zero, and its size will be the signal energy.
In probability and statistics, the term cross-correlations refers to the correlations between the entries of two random vectors $\mathbf{X}$ and $\mathbf{Y}$, while the correlations of a random vector $\mathbf{X}$ are the correlations between the entries of $\mathbf{X}$ itself, those forming the correlation matrix of $\mathbf{X}$. If each of $\mathbf{X}$ and $\mathbf{Y}$ is a scalar random variable which is realized repeatedly in a time series, then the correlations of the various temporal instances of $\mathbf{X}$ are known as autocorrelations of $\mathbf{X}$, and the cross-correlations of $\mathbf{X}$ with $\mathbf{Y}$ across time are temporal cross-correlations. In probability and statistics, the definition of correlation always includes a standardising factor in such a way that correlations have values between −1 and +1.
If $X$ and $Y$ are two independent random variables with probability density functions $f$ and $g$, respectively, then the probability density of the difference $Y - X$ is formally given by the cross-correlation $f \star g$; however, this terminology is not used in probability and statistics. In contrast, the convolution $f * g$ gives the probability density function of the sum $X + Y$.
Cross-correlation of deterministic signals
For continuous functions $f$ and $g$, the cross-correlation is defined as:
$$(f \star g)(\tau) \triangleq \int_{-\infty}^{\infty} \overline{f(t)}\, g(t + \tau)\, dt$$
which is equivalent to
$$(f \star g)(\tau) \triangleq \int_{-\infty}^{\infty} \overline{f(t - \tau)}\, g(t)\, dt$$
where $\overline{f(t)}$ denotes the complex conjugate of $f(t)$, and $\tau$ is the displacement, also known as lag.
If $f$ and $g$ are both continuous periodic functions of period $T$, the integration from $-\infty$ to $\infty$ is replaced by integration over any interval $[t_0, t_0 + T]$ of length $T$:
$$(f \star g)(\tau) \triangleq \int_{t_0}^{t_0 + T} \overline{f(t)}\, g(t + \tau)\, dt$$
which is equivalent to
$$(f \star g)(\tau) \triangleq \int_{t_0}^{t_0 + T} \overline{f(t - \tau)}\, g(t)\, dt$$
Similarly, for discrete functions, the cross-correlation is defined as:
$$(f \star g)[n] \triangleq \sum_{m=-\infty}^{\infty} \overline{f[m]}\, g[m + n]$$
which is equivalent to
$$(f \star g)[n] \triangleq \sum_{m=-\infty}^{\infty} \overline{f[m - n]}\, g[m]$$
For finite discrete functions $f, g \in \mathbb{C}^N$, the (circular) cross-correlation is defined as:
$$(f \star g)[n] \triangleq \sum_{m=0}^{N-1} \overline{f[m]}\, g[(m + n)_{\text{mod}\,N}]$$
which is equivalent to
$$(f \star g)[n] \triangleq \sum_{m=0}^{N-1} \overline{f[(m - n)_{\text{mod}\,N}]}\, g[m]$$
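To make the finite discrete definition concrete, here is a small Python/NumPy sketch (the array values and the helper name `circular_cross_correlation` are illustrative choices, not part of any standard API) that evaluates the circular cross-correlation directly from the sum above and checks the lag-zero autocorrelation against the signal energy mentioned in the lead section.

```python
import numpy as np

# Two short sequences of equal length N (illustrative data).
f = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([2.0, 1.0, 0.0, -1.0])

def circular_cross_correlation(f, g):
    """(f ⋆ g)[n] = sum_{m=0}^{N-1} conj(f[m]) * g[(m + n) mod N]."""
    N = len(f)
    # np.roll(g, -n)[m] equals g[(m + n) mod N], so each term matches the definition.
    return np.array([np.sum(np.conj(f) * np.roll(g, -n)) for n in range(N)])

print(circular_cross_correlation(f, g))

# Consistency check with the lead section: the autocorrelation at lag 0
# equals the signal energy sum(|f|^2).
assert np.isclose(circular_cross_correlation(f, f)[0], np.sum(np.abs(f) ** 2))
```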
For finite discrete functions $f, g$, the kernel cross-correlation is defined as:
where $K_g$ is a vector of kernel functions and $T(\cdot)$ is an affine transform.
Specifically, $T(\cdot)$ can be a circular translation transform, a rotation transform, or a scale transform, among others. The kernel cross-correlation extends cross-correlation from linear space to kernel space. Cross-correlation is equivariant to translation; kernel cross-correlation is equivariant to any affine transform, including translation, rotation, and scaling.
Explanation
As an example, consider two real-valued functions $f$ and $g$ differing only by an unknown shift along the x-axis. One can use the cross-correlation to find how much $g$ must be shifted along the x-axis to make it identical to $f$. The formula essentially slides the $g$ function along the x-axis, calculating the integral of their product at each position. When the functions match, the value of $(f \star g)$ is maximized. This is because when peaks are aligned, they make a large contribution to the integral. Similarly, when troughs align, they also make a positive contribution to the integral because the product of two negative numbers is positive.
With complex-valued functions $f$ and $g$, taking the conjugate of $f$ ensures that aligned peaks with imaginary components will contribute positively to the integral.
In econometrics, lagged cross-correlation is sometimes referred to as cross-autocorrelation.
Properties
- The cross-correlation of functions $f(t)$ and $g(t)$ is equivalent to the convolution (denoted by $*$) of $\overline{f(-t)}$ and $g(t)$. That is:
- : $[f \star g](t) = \left[\overline{f(-\cdot)} * g\right](t)$
- If $f$ is a Hermitian function, then $f \star g = f * g$.
- If both $f$ and $g$ are Hermitian, then $f \star g = g \star f$.
- $\left(f \star g\right) \star \left(f \star g\right) = \left(f \star f\right) \star \left(g \star g\right)$.
- Analogous to the convolution theorem, the cross-correlation satisfies
- : $\mathcal{F}\left\{f \star g\right\} = \overline{\mathcal{F}\left\{f\right\}} \cdot \mathcal{F}\left\{g\right\}$
- : where $\mathcal{F}$ denotes the Fourier transform and the overline again indicates the complex conjugate. Coupled with fast Fourier transform algorithms, this property is often exploited for the efficient numerical computation of cross-correlations; a numerical sketch follows this list.
- The cross-correlation is related to the spectral density (see the Wiener–Khinchin theorem).
- The cross-correlation of a convolution of $f$ and $h$ with a function $g$ is the convolution of the cross-correlation of $g$ and $f$ with the kernel $h$:
- : $g \star \left(f * h\right) = \left(g \star f\right) * h$.
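As a quick numerical check of the convolution-theorem analogue above (and of how it enables FFT-based computation), the sketch below compares a circular cross-correlation computed directly from the definition with the result obtained through the discrete Fourier transform; the random test sequences are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8
f = rng.standard_normal(N) + 1j * rng.standard_normal(N)   # arbitrary complex test data
g = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# Direct circular cross-correlation: (f ⋆ g)[n] = sum_m conj(f[m]) * g[(m + n) mod N]
direct = np.array([np.sum(np.conj(f) * np.roll(g, -n)) for n in range(N)])

# Frequency-domain route, using F{f ⋆ g} = conj(F{f}) * F{g}
via_fft = np.fft.ifft(np.conj(np.fft.fft(f)) * np.fft.fft(g))

assert np.allclose(direct, via_fft)
```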
Cross-correlation of random vectors
Definition
For random vectors $\mathbf{X} = (X_1, \ldots, X_m)^{\rm T}$ and $\mathbf{Y} = (Y_1, \ldots, Y_n)^{\rm T}$, each containing random elements whose expected value and variance exist, the cross-correlation matrix of $\mathbf{X}$ and $\mathbf{Y}$ is defined by
$$\operatorname{R}_{\mathbf{X}\mathbf{Y}} \triangleq \operatorname{E}\left[\mathbf{X} \mathbf{Y}^{\rm T}\right]$$
and has dimensions $m \times n$. Written component-wise:
$$\operatorname{R}_{\mathbf{X}\mathbf{Y}} = \begin{bmatrix} \operatorname{E}[X_1 Y_1] & \operatorname{E}[X_1 Y_2] & \cdots & \operatorname{E}[X_1 Y_n] \\ \operatorname{E}[X_2 Y_1] & \operatorname{E}[X_2 Y_2] & \cdots & \operatorname{E}[X_2 Y_n] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[X_m Y_1] & \operatorname{E}[X_m Y_2] & \cdots & \operatorname{E}[X_m Y_n] \end{bmatrix}$$
The random vectors $\mathbf{X}$ and $\mathbf{Y}$ need not have the same dimension, and either might be a scalar value.
Example
For example, if $\mathbf{X} = \left(X_1, X_2, X_3\right)^{\rm T}$ and $\mathbf{Y} = \left(Y_1, Y_2\right)^{\rm T}$ are random vectors, then $\operatorname{R}_{\mathbf{X}\mathbf{Y}}$ is a $3 \times 2$ matrix whose $(i,j)$-th entry is $\operatorname{E}[X_i Y_j]$.
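A minimal numerical sketch of this definition (the toy distributions and the sample size are arbitrary assumptions made for illustration): drawing many realizations of a 3-dimensional $\mathbf{X}$ and a 2-dimensional $\mathbf{Y}$ and averaging the outer products $\mathbf{X}\mathbf{Y}^{\rm T}$ approximates the $3 \times 2$ cross-correlation matrix entry by entry.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 100_000

# Correlated toy model: Y is a linear function of (part of) X plus noise.
X = rng.standard_normal((n_samples, 3))            # rows are realizations of X
Y = (X[:, :2] @ np.array([[1.0, 0.5],
                          [0.0, 2.0]])
     + 0.1 * rng.standard_normal((n_samples, 2)))  # rows are realizations of Y

# Sample estimate of R_XY = E[X Y^T]; the (i, j) entry approximates E[X_i Y_j].
R_XY = X.T @ Y / n_samples
print(R_XY.shape)   # (3, 2)
print(R_XY)
```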
Definition for complex random vectors
If $\mathbf{Z} = (Z_1, \ldots, Z_m)^{\rm T}$ and $\mathbf{W} = (W_1, \ldots, W_n)^{\rm T}$ are complex random vectors, each containing random variables whose expected value and variance exist, the cross-correlation matrix of $\mathbf{Z}$ and $\mathbf{W}$ is defined by
$$\operatorname{R}_{\mathbf{Z}\mathbf{W}} \triangleq \operatorname{E}\left[\mathbf{Z} \mathbf{W}^{\rm H}\right]$$
where ${}^{\rm H}$ denotes Hermitian transposition.
Cross-correlation of stochastic processes
In time series analysis and statistics, the cross-correlation of a pair of random processes is the correlation between values of the processes at different times, as a function of the two times. Let $(X_t, Y_t)$ be a pair of random processes, and $t$ be any point in time. Then $X_t$ is the value produced by a given run of the process at time $t$.
Cross-correlation function
Suppose that the processes have means $\mu_X(t)$ and $\mu_Y(t)$ and variances $\sigma_X^2(t)$ and $\sigma_Y^2(t)$ at time $t$, for each $t$. Then the definition of the cross-correlation between times $t_1$ and $t_2$ is
$$\operatorname{R}_{XY}(t_1, t_2) \triangleq \operatorname{E}\left[X_{t_1} \overline{Y_{t_2}}\right]$$
where $\operatorname{E}$ is the expected value operator. Note that this expression may not be defined.
Cross-covariance function
Subtracting the mean before multiplication yields the cross-covariance between times $t_1$ and $t_2$:
$$\operatorname{K}_{XY}(t_1, t_2) \triangleq \operatorname{E}\left[\left(X_{t_1} - \mu_X(t_1)\right)\overline{\left(Y_{t_2} - \mu_Y(t_2)\right)}\right]$$
Note that this expression is not well-defined for all time series or processes, because the mean may not exist, or the variance may not exist.
Definition for wide-sense stationary stochastic process
Let $(X_t, Y_t)$ represent a pair of stochastic processes that are jointly wide-sense stationary. Then the cross-covariance function and the cross-correlation function are given as follows.
Cross-correlation function
$$\operatorname{R}_{XY}(\tau) \triangleq \operatorname{E}\left[X_t \overline{Y_{t+\tau}}\right]$$
or equivalently
$$\operatorname{R}_{XY}(\tau) = \operatorname{E}\left[X_{t-\tau} \overline{Y_{t}}\right]$$
Cross-covariance function
$$\operatorname{K}_{XY}(\tau) \triangleq \operatorname{E}\left[\left(X_t - \mu_X\right)\overline{\left(Y_{t+\tau} - \mu_Y\right)}\right]$$
or equivalently
$$\operatorname{K}_{XY}(\tau) = \operatorname{E}\left[\left(X_{t-\tau} - \mu_X\right)\overline{\left(Y_{t} - \mu_Y\right)}\right]$$
where $\mu_X$ and $\sigma_X$ are the mean and standard deviation of the process $(X_t)$, which are constant over time due to stationarity; and similarly for $\mu_Y$ and $\sigma_Y$, respectively. $\operatorname{E}[\,\cdot\,]$ indicates the expected value. That the cross-covariance and cross-correlation are independent of $t$ is precisely the additional information conveyed by the requirement that $(X_t, Y_t)$ are jointly wide-sense stationary.
The cross-correlation of a pair of jointly wide sense stationary stochastic processes can be estimated by averaging the product of samples measured from one process and samples measured from the other. The samples included in the average can be an arbitrary subset of all the samples in the signal. For a large number of samples, the average converges to the true cross-correlation.
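The sketch below illustrates that averaging idea for a pair of discrete-time sequences (the construction of the signals, the chosen lags, and the sample size are illustrative assumptions, not from the article): the product of one process with a lagged copy of the other, averaged over many time samples, estimates $\operatorname{R}_{XY}(\tau)$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
delay = 3                                  # y lags x by 3 samples in this toy construction

w = rng.standard_normal(n + delay)         # common driving white noise
x = w[delay:]                              # x_t = w_{t+delay}
y = w[:n] + 0.5 * rng.standard_normal(n)   # y_t = w_t + noise, i.e. a delayed copy of x

def R_hat(x, y, tau):
    """Estimate R_XY(tau) = E[x_t * y_{t+tau}] by averaging products of samples."""
    return np.mean(x[: len(x) - tau] * y[tau:])

print(R_hat(x, y, 3))   # close to 1: the lag at which y is an (approximate) copy of x
print(R_hat(x, y, 0))   # close to 0: uncorrelated at zero lag in this construction
```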
Normalization
It is common practice in some disciplines to normalize the cross-correlation function to get a time-dependent Pearson correlation coefficient. However, in other disciplines the normalization is usually dropped and the terms "cross-correlation" and "cross-covariance" are used interchangeably.
The definition of the normalized cross-correlation of a pair of stochastic processes is
$$\rho_{XY}(t_1, t_2) = \frac{\operatorname{K}_{XY}(t_1, t_2)}{\sigma_X(t_1)\,\sigma_Y(t_2)} = \frac{\operatorname{E}\left[\left(X_{t_1} - \mu_X(t_1)\right)\overline{\left(Y_{t_2} - \mu_Y(t_2)\right)}\right]}{\sigma_X(t_1)\,\sigma_Y(t_2)}$$
If the function $\rho_{XY}$ is well-defined, its value must lie in the range $[-1, 1]$, with 1 indicating perfect correlation and −1 indicating perfect anti-correlation.
For jointly wide-sense stationary stochastic processes, the definition is
$$\rho_{XY}(\tau) = \frac{\operatorname{K}_{XY}(\tau)}{\sigma_X\,\sigma_Y} = \frac{\operatorname{E}\left[\left(X_t - \mu_X\right)\overline{\left(Y_{t+\tau} - \mu_Y\right)}\right]}{\sigma_X\,\sigma_Y}$$
The normalization is important both because the interpretation of the autocorrelation as a correlation provides a scale-free measure of the strength of statistical dependence, and because the normalization has an effect on the statistical properties of the estimated autocorrelations.
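Continuing the previous sketch's style, the normalized coefficient for jointly WSS signals can be estimated by standardizing the samples first; in the following illustration (the signal construction and lags are again made-up assumptions), the estimate lands in $[-1, 1]$ even though the two signals have different means and scales.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Toy jointly WSS pair with nonzero means and different scales.
w = rng.standard_normal(n + 2)
x = 5.0 + 3.0 * w[2:]                                    # x_t = 5 + 3 * w_{t+2}
y = -1.0 + 0.5 * w[:n] + 0.2 * rng.standard_normal(n)    # y_t echoes x two samples later

def rho_hat(x, y, tau):
    """Estimate rho_XY(tau) = K_XY(tau) / (sigma_X * sigma_Y) from samples."""
    x0 = x[: len(x) - tau] - np.mean(x)
    y0 = y[tau:] - np.mean(y)
    return np.mean(x0 * y0) / (np.std(x) * np.std(y))

print(rho_hat(x, y, 2))   # strongly positive (about 0.93 for this construction)
print(rho_hat(x, y, 0))   # close to 0
```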
Properties
Symmetry property
For jointly wide-sense stationary stochastic processes, the cross-correlation function has the following symmetry property:
$$\operatorname{R}_{XY}(t_1, t_2) = \overline{\operatorname{R}_{YX}(t_2, t_1)}$$
Respectively for jointly WSS processes:
$$\operatorname{R}_{XY}(\tau) = \overline{\operatorname{R}_{YX}(-\tau)}$$
Time delay analysis
Cross-correlations are useful for determining the time delay between two signals, e.g., for determining time delays for the propagation of acoustic signals across a microphone array. After calculating the cross-correlation between the two signals, the maximum of the cross-correlation function indicates the point in time where the signals are best aligned; i.e., the time delay between the two signals is determined by the argument of the maximum, or arg max of the cross-correlation, as in
$$\tau_{\mathrm{delay}} = \underset{t \in \mathbb{R}}{\operatorname{arg\,max}}\left((f \star g)(t)\right)$$
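As a concrete illustration of the arg max recipe (sampling rate, delay, and noise level below are made-up values for the sketch, not taken from the article), one can recover the delay between a signal and its delayed, noisy copy and convert the best-aligning lag into seconds.

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 1000                                    # sampling rate in Hz (assumed)
true_delay = 37                              # g is f delayed by 37 samples

f = rng.standard_normal(5000)
g = np.concatenate([np.zeros(true_delay), f])[: len(f)]
g = g + 0.1 * rng.standard_normal(len(g))    # delayed, noisy copy of f

# np.correlate(g, f, "full")[i] corresponds to lag i - (len(f) - 1),
# and implements sum_n g[n + lag] * conj(f[n]), i.e. (f ⋆ g)[lag].
xcorr = np.correlate(g, f, mode="full")
lags = np.arange(-(len(f) - 1), len(g))
best_lag = lags[np.argmax(xcorr)]

print(best_lag)        # 37 samples
print(best_lag / fs)   # estimated delay in seconds: 0.037
```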
Terminology in image processing
Zero-normalized cross-correlation (ZNCC)
For image-processing applications in which the brightness of the image and template can vary due to lighting and exposure conditions, the images can be first normalized. This is typically done at every step by subtracting the mean and dividing by the standard deviation. That is, the cross-correlation of a template $t(x,y)$ with a subimage $f(x,y)$ is
$$\frac{1}{n} \sum_{x,y} \frac{1}{\sigma_f \sigma_t}\left(f(x,y) - \mu_f\right)\left(t(x,y) - \mu_t\right)$$
where $n$ is the number of pixels in $t(x,y)$ and $f(x,y)$, $\mu_f$ is the average of $f$ and $\sigma_f$ is the standard deviation of $f$.
In functional analysis terms, this can be thought of as the dot product of two normalized vectors. That is, if
$$F(x,y) = f(x,y) - \mu_f$$
and
$$T(x,y) = t(x,y) - \mu_t$$
then the above sum is equal to
$$\left\langle \frac{F}{\|F\|}, \frac{T}{\|T\|} \right\rangle$$
where $\langle \cdot, \cdot \rangle$ is the inner product and $\|\cdot\|$ is the L² norm.
Thus, if $f$ and $t$ are real matrices, their normalized cross-correlation equals the cosine of the angle between the unit vectors $F$ and $T$, being thus $1$ if and only if $F$ equals $T$ multiplied by a positive scalar.
Normalized correlation is one of the methods used for template matching, a process used for finding occurrences of a pattern or object within an image. It is also the 2-dimensional version of the Pearson product-moment correlation coefficient.
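A compact sketch of template matching with zero-normalized cross-correlation (the image content, the planted location, and the brute-force scan over all positions are illustrative assumptions; production code would typically rely on library routines or FFT-based acceleration): each subimage and the template are standardized before taking their dot product, and the location with the highest score is reported.

```python
import numpy as np

def zncc(patch, template):
    """Zero-normalized cross-correlation of two equally sized 2-D arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    return float(np.sum(p * t) / (patch.std() * template.std() * patch.size))

rng = np.random.default_rng(5)
image = rng.random((64, 64))
template = image[20:28, 30:38].copy()        # "plant" the template at row 20, column 30
image = 0.5 * image + 0.2                    # global brightness/contrast change

h, w = template.shape
scores = np.empty((image.shape[0] - h + 1, image.shape[1] - w + 1))
for i in range(scores.shape[0]):
    for j in range(scores.shape[1]):
        scores[i, j] = zncc(image[i:i + h, j:j + w], template)

# ZNCC is invariant to the affine brightness change, so the planted location wins.
print(np.unravel_index(np.argmax(scores), scores.shape))   # (20, 30)
```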