Term | Description |
---|---|

1/f noise | A signal with a frequency content such that its power spectral density S(f) is proportional to a small power α of the reciprocal of the frequency, i.e., S(f) ∝ 1/f^{α}, where the exponent α satisfies 0 ≤ α ≤ 3. Such signals are characteristic of complex processes without a preferred timescale, e.g., long-range-dependent signals. |
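One way to make this definition concrete is to synthesize such a signal by spectral shaping: scale the spectrum of white noise so that power falls off as f^{−α}. A minimal numpy sketch (the function name `one_over_f_noise` is illustrative, not from the source):

```python
import numpy as np

def one_over_f_noise(n, alpha, seed=None):
    """Generate a signal whose power spectral density falls off as 1/f**alpha.

    Sketch: shape the spectrum of white Gaussian noise so that its amplitude
    scales as f**(-alpha/2) (power ~ amplitude**2 ~ f**(-alpha)), then
    transform back to the time domain.
    """
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n)
    spectrum = np.fft.rfft(white)
    f = np.fft.rfftfreq(n)
    f[0] = f[1]                      # avoid division by zero at DC
    spectrum *= f ** (-alpha / 2.0)
    return np.fft.irfft(spectrum, n)

x = one_over_f_noise(4096, alpha=1.0, seed=0)
```

With α = 1 the low-frequency part of the periodogram of `x` carries far more power than the high-frequency part, as the definition requires.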

Continuous-time Markov chain | A stochastic process X(t) in continuous time for which the conditional probability distribution at time t depends only on a finite number n of values X(s_{1}), X(s_{2}), …, X(s_{n}) from its past, s_{1} < s_{2} < … < s_{n} < t. Such a process is also called a process with finite memory. The special case where n = 0 is a memoryless process. |
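In the memoryless special case such a chain is easy to simulate: the holding time in each state is exponentially distributed, so the next transition depends only on the current state. A minimal two-state sketch (function name and rate values are invented for illustration):

```python
import numpy as np

def simulate_ctmc(rates, t_max, seed=None):
    """Simulate a two-state continuous-time Markov chain.

    rates[i] is the transition rate out of state i; holding times are
    exponential with mean 1/rates[i], so the future depends only on the
    present state. Returns jump times and the state after each jump.
    """
    rng = np.random.default_rng(seed)
    t, state = 0.0, 0
    times, states = [0.0], [0]
    while t < t_max:
        t += rng.exponential(1.0 / rates[state])
        state = 1 - state            # two states: toggle on each jump
        times.append(t)
        states.append(state)
    return np.array(times), np.array(states)

times, states = simulate_ctmc(rates=[2.0, 0.5], t_max=100.0, seed=1)
```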

Eigenvalue | Principal component analysis decomposes the covariance matrix of a multivariate data set into an ordered set of eigenvectors along which the data varies the most. The corresponding eigenvalues represent the variance of the data explained by each eigenvector. |
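The link between eigenvalues and explained variance can be sketched in a few lines of numpy (the correlated two-channel toy data are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: the second channel partly follows the first, so the data
# vary most along a shared direction.
data = rng.standard_normal((1000, 2))
data[:, 1] += 0.8 * data[:, 0]

cov = np.cov(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
order = np.argsort(eigvals)[::-1]        # reorder: largest variance first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Fraction of total variance explained by each eigenvector.
explained = eigvals / eigvals.sum()
```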

Eigenvector coefficients | The components of a principal mode (or eigenmode), i.e., of an eigenvector of the covariance matrix in principal component analysis. |

Hilbert transform | A tool from signal processing that shifts the phase of each frequency component of a bandpass-filtered signal by 90°. Adding the Hilbert transform as imaginary part to the original signal results in an analytic signal, from which the instantaneous phase and the envelope of the signal can be calculated. |
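A numpy-only sketch of this construction (equivalent in spirit to `scipy.signal.hilbert`; the helper name is ours): zeroing the negative frequencies of the spectrum and transforming back yields the analytic signal, whose modulus is the envelope and whose argument is the instantaneous phase.

```python
import numpy as np

def analytic_signal(x):
    """Build the analytic signal x + i*H[x] by one-sided spectral doubling."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0                   # keep DC
    h[1:(n + 1) // 2] = 2.0      # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0          # keep Nyquist bin for even n
    return np.fft.ifft(spectrum * h)

t = np.linspace(0, 1, 1000, endpoint=False)
z = analytic_signal(np.cos(2 * np.pi * 10 * t))
envelope = np.abs(z)             # instantaneous amplitude (here: constant 1)
phase = np.unwrap(np.angle(z))   # instantaneous phase (here: grows linearly)
```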

Hurst exponent | The Hurst exponent H (where 1/2 ≤ H ≤ 1) quantifies the self-similarity of a signal. A stochastic process X(t) is called self-similar if X(at) has the same probability distribution as a^{H}X(t) for each a > 0. Long-range-dependent signals (e.g., arising from a process with a 1/f power spectral density) exhibit values of H close to 1. |
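One standard estimator of H is the aggregated-variance method: for a self-similar process the variance of block means scales as m^{2H−2}, so H can be read off a log–log fit. A numpy sketch (illustrative only; for white noise the estimate should come out near the boundary value H = 1/2):

```python
import numpy as np

def hurst_aggvar(x, block_sizes):
    """Estimate the Hurst exponent by the aggregated-variance method.

    For each block size m, average x over non-overlapping blocks and record
    the variance of the block means; the slope of log(var) vs log(m) is
    2H - 2 for a self-similar process.
    """
    logm, logv = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        logm.append(np.log(m))
        logv.append(np.log(means.var()))
    slope = np.polyfit(logm, logv, 1)[0]
    return 1.0 + slope / 2.0

rng = np.random.default_rng(0)
h = hurst_aggvar(rng.standard_normal(2 ** 16), [4, 8, 16, 32, 64, 128])
```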

Long-range dependence | A stochastic process is long-range dependent if its autocorrelation function R(τ) falls off slower than exponentially for large values of the lag τ. Such processes have infinite memory and typically give rise to power spectral densities with a 1/f distribution. |
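For contrast, a short-range-dependent example helps fix the definition: an AR(1) process has autocorrelation r(τ) = φ^{τ}, an exponential falloff, so it is not long-range dependent. A numpy sketch checking this numerically (parameters invented):

```python
import numpy as np

# AR(1): x[i] = phi * x[i-1] + noise has ACF r(tau) = phi**tau, i.e., an
# exponential decay in the lag tau -- the opposite of long-range dependence.
phi, n = 0.9, 200_000
rng = np.random.default_rng(0)
eps = rng.standard_normal(n)
x = np.empty(n)
x[0] = eps[0]
for i in range(1, n):
    x[i] = phi * x[i - 1] + eps[i]

x -= x.mean()
lags = np.array([1, 5, 10, 20])
acf = np.array([np.dot(x[:-k], x[k:]) / np.dot(x, x) for k in lags])
theory = phi ** lags   # exponential decay predicted by the AR(1) model
```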

Power spectral density | The power spectral density S(f) is the Fourier transform of the autocorrelation function of a stochastic process X(t). It is closely related to the Fourier transform of the signal (which does not exist if X(t) is long-range dependent) and quantifies the frequency content of X(t), i.e., S(f) is the contribution to the variance (or "power") of X(t) at the frequency f. |
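This Wiener–Khinchin relation can be verified numerically in its discrete, circular form: the FFT of the circular autocorrelation of a finite signal equals its periodogram. Sketch (numpy only, toy white-noise data):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
n = len(x)

# Periodogram: squared magnitude of the Fourier transform of the signal.
periodogram = np.abs(np.fft.fft(x)) ** 2 / n

# Circular autocorrelation: r[k] = (1/n) * sum_i x[i] * x[(i+k) mod n].
autocorr = np.array([np.dot(x, np.roll(x, -k)) for k in range(n)]) / n

# Wiener-Khinchin (discrete form): the Fourier transform of the
# autocorrelation function reproduces the periodogram.
psd = np.fft.fft(autocorr).real
```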

Projection weight | The scalar product of the original signal with an eigenvector of the covariance matrix in principal component analysis, i.e., the coefficient of that eigenvector when the signal is written as a linear combination of all eigenvectors. It quantifies how much the eigenvector in question contributes to the signal. |
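Because the eigenvectors of a covariance matrix form an orthonormal basis, the projection weights are plain scalar products, and they reconstruct the signal exactly. A numpy sketch (toy data invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((500, 3))
cov = np.cov(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # columns are orthonormal eigenvectors

signal = data[0]                 # one multivariate sample
weights = eigvecs.T @ signal     # projection weight = scalar product per eigenvector

# The weights are the coefficients of the signal in the eigenvector basis,
# so summing weight * eigenvector recovers the signal exactly.
reconstruction = eigvecs @ weights
```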

Wavelet decomposition | A time–frequency representation of a signal X(t) that quantifies its local frequency content, similar to windowed Fourier analysis. The signal X(t) is convolved with a family of short test signals that are rescaled and translated copies of a single "mother" signal (wavelet). |
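A minimal numpy sketch of such a decomposition, using rescaled copies of a Ricker ("Mexican hat") mother wavelet; the function names and width values are illustrative, and libraries such as PyWavelets provide full-featured versions.

```python
import numpy as np

def ricker(points, width):
    """Ricker ('Mexican hat') mother wavelet sampled at `points` points."""
    t = np.arange(points) - (points - 1) / 2.0
    a = t / width
    return (1 - a ** 2) * np.exp(-a ** 2 / 2)

def cwt(x, widths):
    """Continuous wavelet transform sketch: convolve x with rescaled wavelets.

    Each row of the output is the convolution of x with one rescaled copy of
    the mother wavelet; translation is implicit in the convolution.
    """
    out = np.empty((len(widths), len(x)))
    for i, w in enumerate(widths):
        wavelet = ricker(min(10 * w, len(x)), w)
        out[i] = np.convolve(x, wavelet, mode='same')
    return out

t = np.linspace(0, 1, 512)
x = np.sin(2 * np.pi * 16 * t)
coeffs = cwt(x, widths=[1, 2, 4, 8, 16])   # one row per scale
```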