2.1.2 Probabilistic Averages and Stationarity
To study the system, the randomness must be dealt with: the expected value operator E[·] performs a statistical average over the random process, producing its moments: the first-order moment of the random process, called the mean (2.3), and its second-order moment, called the autocorrelation (2.4).
m(t) = E[X(t)] = \int_{-\infty}^{+\infty} x \, f(x, t) \, dx \qquad (2.3)

R(\tau, t) = E[X(t)X(t+\tau)] = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} x_1 x_2 \, f(x_1, x_2; \tau) \, dx_1 \, dx_2 \qquad (2.4)
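As an illustration (not part of the source), the ensemble averages in (2.3) and (2.4) can be approximated numerically by averaging over many realizations of a process. The random-phase cosine below is a hypothetical example chosen because its theoretical mean and autocorrelation are known in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy random process (hypothetical example): X(t) = cos(2*pi*f*t + phi),
# with phi uniform on [0, 2*pi). Each row is one realization.
n_real, n_samp, f = 5000, 64, 0.05
t = np.arange(n_samp)
phi = rng.uniform(0, 2 * np.pi, size=(n_real, 1))
X = np.cos(2 * np.pi * f * t + phi)

# First-order moment (2.3): ensemble mean at each time instant
# (theory: m(t) = 0 for this process)
m_t = X.mean(axis=0)

# Second-order moment (2.4): autocorrelation R(tau, t) = E[X(t) X(t+tau)],
# estimated by averaging the lagged product over realizations
# (theory: R(tau) = 0.5 * cos(2*pi*f*tau), independent of t)
tau = 4
R_tau = (X[:, : n_samp - tau] * X[:, tau:]).mean(axis=0)
```

With enough realizations, `m_t` stays near zero for all `t` and `R_tau` is approximately constant over `t`, matching the theoretical value 0.5·cos(2πfτ).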
A major goal in time series analysis is to estimate the unknown structure of the process from the available data. The resulting model can only be used to predict future data if the process is stable, or stationary. A process is said to be stationary if its statistical properties do not change over time, i.e., for any time delay c it verifies (2.5); more precisely, a change of origin in the time domain does not affect any of the nth-order cumulative distribution functions.
F(x_1, x_2, \ldots, x_n; t_1, t_2, \ldots, t_n) = F(x_1, x_2, \ldots, x_n; t_1 + c, t_2 + c, \ldots, t_n + c) \qquad (2.5)
This implies that the first-order statistics (mean) are independent of time, and that the second-order statistics (autocorrelation), as well as higher-order ones, depend only on the time difference τ and not on absolute time. This definition is very restrictive, so a broader notion of stationarity is defined that focuses only on the first two moments: wide-sense stationarity (WSS), or weak stationarity [1]. A process is said to be WSS if its mean is constant and its autocorrelation depends only on the time difference.
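A minimal numerical sketch (an illustration, not from the source) of the difference between a WSS and a non-stationary process: white noise has constant mean and a lag-only autocorrelation, whereas its cumulative sum, a random walk, has a variance that grows with absolute time and is therefore not WSS:

```python
import numpy as np

rng = np.random.default_rng(1)
n_real, n_samp = 4000, 100

# White noise: i.i.d. N(0,1) samples -> WSS
# (constant zero mean, variance R(0) = 1 at every time instant)
noise = rng.standard_normal((n_real, n_samp))

# Random walk: cumulative sum of the white noise -> NOT WSS
# (variance at time index k is k+1, i.e. it depends on absolute time)
walk = np.cumsum(noise, axis=1)

# Ensemble variance (= R(0, t) for zero-mean processes) at two time instants
var_noise_early = noise[:, 10].var()
var_noise_late = noise[:, 90].var()
var_walk_early = walk[:, 10].var()
var_walk_late = walk[:, 90].var()
```

For the white noise the two variance estimates agree (both near 1), while for the random walk the later variance is several times larger, violating the WSS requirement that second-order statistics depend only on the time difference.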