2. Moving Average (MA) Model
ECO374H1
Department of Economics
Summer 2025
White Noise Process
We will construct time series models out of building blocks
The simplest building block is the white noise process, {εt}, where each εt is defined as an independent random shock with E[εt] = 0 and Var(εt) = σε² for each t, and
ρk = 0 for k ≥ 1 (ACF)
rk = 0 for k ≥ 1 (PACF)
i.e. both the ACF and the PACF are zero at all lags
{εt} is a covariance stationary process
See R file 2a. White Noise Simulation for simulated draws of the white noise process
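As a minimal sketch (not the course's simulation file; the seed, sample size, and shock standard deviation below are arbitrary choices), white noise can be simulated in R and its sample ACF and PACF inspected:

```r
# Sketch: simulate Gaussian white noise and check that the sample
# ACF and PACF are close to zero at every lag k >= 1.
set.seed(374)            # assumed seed, for reproducibility only
T_obs <- 500             # assumed sample size
sigma <- 1               # assumed shock standard deviation
eps   <- rnorm(T_obs, mean = 0, sd = sigma)

par(mfrow = c(3, 1))
plot.ts(eps, main = "Simulated white noise", ylab = expression(epsilon[t]))
acf(eps,  main = "Sample ACF: spikes near zero for all lags")
pacf(eps, main = "Sample PACF: spikes near zero for all lags")
```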
Wold Decomposition Theorem
The white noise process is formally incorporated in linear time series models based on the following theorem, which implies that every covariance stationary stochastic process {Yt} can be written as the sum of two time series, one deterministic and one stochastic:
Theorem (Wold Decomposition Theorem)
If {Yt} is a covariance stationary process and {εt} is a zero-mean white noise process, then there exists a unique linear representation

Yt = Vt + Σ_{j=0}^∞ ψj εt−j    (1)

where Vt is a deterministic component and Σ_{j=0}^∞ ψj εt−j is the stochastic component, with ψ0 = 1 and Σ_{j=0}^∞ ψj² < ∞.
Model Components
In the decomposition above, the sequence {εt} is called the sequence of random shocks or innovations
Since Σ_{j=0}^∞ ψj² < ∞, there must be some lag j beyond which the subsequent weights ψj+1, ψj+2, ... become small enough that the corresponding innovations εt−(j+1), εt−(j+2), ... have a negligible effect on Yt
The deterministic component Vt can include a trend or cycle
In this Section we will assume Vt = 0, and return to it with a full model in future Sections
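To illustrate why square-summable, decaying weights make distant shocks negligible, the sketch below builds Yt from a truncated Wold sum with Vt = 0 and an assumed geometric weight sequence ψj = 0.7^j (an illustrative choice, not part of the notes):

```r
# Sketch: construct Yt = sum_{j=0}^{J} psi_j * eps_{t-j} with decaying
# weights; shocks further back than J lags contribute essentially nothing.
set.seed(374)
T_obs <- 300
J     <- 50                          # truncation lag (assumed)
psi   <- 0.7^(0:J)                   # psi_0 = 1, weights shrink toward zero
eps   <- rnorm(T_obs + J)            # extra shocks so early Yt have a full history

Y <- sapply((J + 1):(T_obs + J), function(t) sum(psi * eps[t:(t - J)]))
plot.ts(Y, main = "Truncated Wold representation with decaying weights")
```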
Lag Operator Representation
We can write the Wold decomposition in terms of the lag operator L:

Yt = Vt + Ψ(L) εt

where we define the composite lag operator

Ψ(L) = ψ0 + ψ1 L + ψ2 L² + ψ3 L³ + ⋯,  so that Ψ(L) εt = Σ_{j=0}^∞ ψj εt−j
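As a rough illustration, applying a truncated Ψ(L) to a white-noise series is just a one-sided weighted sum of current and past values, which base R's stats::filter can compute; the coefficients below are assumed for illustration:

```r
# Sketch: a truncated lag polynomial Psi(L) = 1 + psi_1 L + psi_2 L^2
# applied to eps is a one-sided convolution of eps with (psi_0, psi_1, psi_2).
set.seed(374)
eps <- rnorm(200)
psi <- c(1, 0.5, 0.25)               # assumed psi_0, psi_1, psi_2

# sides = 1 uses current and past values only: Y_t = sum_j psi_j * eps_{t-j}
Y <- stats::filter(eps, filter = psi, method = "convolution", sides = 1)
head(Y)                              # first two entries are NA (not enough lags)
```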
Moving Average
We can approximate (1) with the finite-order model

Yt = εt + θ1 εt−1 + θ2 εt−2 + ⋯ + θq εt−q

called the Moving Average model of order q, denoted by MA(q)
In practice, we seek a good approximation of the dynamics of Yt with few parameters, i.e. with a small q
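As a sketch of this idea, the code below uses arima.sim to simulate an MA(2) process with assumed coefficients θ1 = 0.6 and θ2 = 0.3; for an MA(q) process the population ACF is zero beyond lag q, which the sample ACF should approximately reflect:

```r
# Sketch: simulate an MA(2) process with assumed coefficients and inspect
# its sample ACF and PACF.
set.seed(374)
theta <- c(0.6, 0.3)                           # assumed theta_1, theta_2
Y <- arima.sim(model = list(ma = theta), n = 500)

par(mfrow = c(2, 1))
acf(Y,  main = "MA(2): sample ACF approximately zero beyond lag 2")
pacf(Y, main = "MA(2): sample PACF tails off")
```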