MIDTERM-1 Questions

Sample Question 1

For the graduate students only. A time series with a periodic component can be constructed from

$$x_t = U_1 \sin(2\pi\omega t) + U_2 \cos(2\pi\omega t),$$

where $U_1$ and $U_2$ are independent random variables with zero means and $E(U_1^2) = E(U_2^2) = \sigma^2$. The constant $\omega$ determines the period or time it takes the process to make one complete cycle. Show that this series is weakly stationary with autocovariance function

$$\gamma(h) = \sigma^2 \cos(2\pi\omega h).$$
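
A quick Monte Carlo sanity check of the claimed ACVF (a sketch only, not a substitute for the proof): draw many independent copies of $(U_1, U_2)$, form $x_t$ and $x_{t+h}$, and compare the empirical covariance with $\sigma^2 \cos(2\pi\omega h)$. The normal distribution for $U_1, U_2$ and the particular $\omega$, $t$, $h$ below are arbitrary choices for illustration.

```python
import numpy as np

# Monte Carlo check of gamma(h) = sigma^2 * cos(2*pi*omega*h).
# U1, U2 ~ N(0, sigma^2) is just one convenient choice; the claim only needs
# zero means, equal variances, and independence.
rng = np.random.default_rng(0)
sigma, omega, t, h, n_rep = 1.0, 0.1, 7, 3, 200_000

U1 = rng.normal(0.0, sigma, n_rep)
U2 = rng.normal(0.0, sigma, n_rep)

def x(time):
    return U1 * np.sin(2 * np.pi * omega * time) + U2 * np.cos(2 * np.pi * omega * time)

emp_cov = np.mean(x(t) * x(t + h))            # mean is 0, so E[x_t x_{t+h}] is the covariance
theory = sigma**2 * np.cos(2 * np.pi * omega * h)
print(emp_cov, theory)                        # should agree up to Monte Carlo error
```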

Sample Question 24

Let $\{Z_t\}$ be a sequence of independent normal random variables, each with mean $0$ and variance $\sigma^2$, and let $a$, $b$, and $c$ be constants. Which, if any, of the following processes are stationary? For each stationary process specify the mean and autocovariance function.

(a) $X_t = a + bZ_t + cZ_{t-2}$

(b) $X_t = Z_1 \cos(ct) + Z_2 \sin(ct)$

Sample Question 10

Let $\{Z_t\}$ be a zero mean $\text{WN}(\sigma^2)$ process. For each of the following parts, state whether the given process $\{X_t\}$ is stationary or not. If it is not stationary, prove it. If it is stationary, compute its ACF.

(a) $X_t = (-1)^t Z_0 + (-1)^{t+1} Z_1$.

(b) $X_t = Z_0 s_t + Z_1 s_{t-1}$, where $\{s_t\}$ is seasonal with period 2.

Sample Question 7

(a) Suppose that $\{X_t\}$ is a weakly stationary time series with $E(X_t) = 0$. Define $Y_t = (-1)^t X_t$ for any integer $t$. Is $\{Y_t\}$ a weakly stationary time series? Prove your result if your answer is yes and give a counter-example if your answer is no.

Sample Question 14

Suppose $\{s_t\}$ is a seasonal component with period $d = 2$. Show that $|\nabla^2 s_t|$ has no seasonal component, where $\nabla$ is the difference operator.

Sample Question 12

Let $\{s_t\}$ be a seasonal component with period 10 and $\{Y_t\}$ be a weakly stationary process with mean $0$ and ACVF $\gamma_Y(h)$. Let $X_t = s_t + Y_t, t \in \mathbb{Z}$.

(a) Is $\{X_t\}$ weakly stationary? If so, compute its ACVF in terms of $\gamma_Y(\cdot)$.

(b) Let $U_t = X_t - X_{t-1}$. Is $\{U_t\}$ weakly stationary? If so, compute its ACVF in terms of $\gamma_Y(\cdot)$. If not, does $\{U_t\}$ still have a seasonal component?

Sample Question 25

Let $d_1$ and $d_2$ be positive integers, with $d_1 \neq d_2$. Let $\{s_t^{(d_1)}\}$ and $\{s_t^{(d_2)}\}$ be seasonal components with periods $d_1$ and $d_2$, respectively. Let $X_t = s_t^{(d_1)} + s_t^{(d_2)}$ and let $Y_t = \nabla_{d_1} X_t$. For each of the following statements, say whether it is TRUE or FALSE. If TRUE prove the statement, and if FALSE give a counterexample.

(i) $\nabla_{d_1 d_2} X_t$ is a constant.

(ii) $\nabla_{d_1 + d_2} X_t$ is a constant.

Sample Question 26

In the following parts, $\nabla_d$ denotes the lag-$d$ difference operator.

(a) Suppose $\{X_t\}$ has 2 seasonal components, one of period 12 and one of period 28. What is the smallest $d$ such that $\{\nabla_d X_t\}$ will have no seasonal components? More generally, suppose $\{X_t\}$ has $k$ seasonal components of periods $d_1, \dots, d_k$. What is the smallest $d$ such that $\{\nabla_d X_t\}$ will have no seasonal components?

(b) Suppose $\{X_t\}$ has 2 seasonal components, one of period 5 and one of period 7, and also a quadratic polynomial trend. Give a causal filter with no more than three nonzero coefficients such that the output of the filter applied to $\{X_t\}$ will have no trend and no seasonal components.

Sample Question 2

Let $\{X_t\}$ be a zero-mean stationary process, for $t \in \mathbb{Z}$, with ACF $\gamma_X(h)$. In each of the following parts, state whether the process $\{Y_t\}$ is necessarily stationary or not. If it is not necessarily stationary, prove it. If it is necessarily stationary, give the ACF of $\{Y_t\}$ in terms of the ACF of $\{X_t\}$:

(a) $Y_t = (-1)^t X_t$

(b) $Y_t = X_{|t|}$

(c) $Y_t = X_{kt}$, where $k > 1$ is an integer

(d) $Y_t = X_{t^3}$

Example Problem 1.12 (a)

Suppose $\{a_j\}$ is a linear filter and suppose

$$m_t = c_0 + c_1 t + \dots + c_k t^k$$

is a polynomial trend of degree $k$. Show that the filter passes $m_t$ exactly (i.e., $m_t = \sum_{j=-\infty}^{\infty} a_j m_{t-j}$) if and only if

(i) $\sum_{j=-\infty}^{\infty} a_j = 1$

(ii) $\sum_{j=-\infty}^{\infty} j^r a_j = 0 \quad \text{for } r = 1, \dots, k$.
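
As a small numerical illustration of the result (not part of the problem statement), the symmetric filter $a_{-1} = a_0 = a_1 = 1/3$ satisfies (i) and (ii) for $r = 1$, so it passes a linear trend exactly, while it fails $\sum_j j^2 a_j = 0$ and therefore distorts a quadratic trend. The filter and the trend coefficients below are arbitrary choices for the check.

```python
import numpy as np

# Check: a filter with sum(a_j) = 1 and sum(j * a_j) = 0 passes a linear
# trend exactly, but distorts a quadratic one (it fails sum(j^2 * a_j) = 0).
a = {-1: 1/3, 0: 1/3, 1: 1/3}          # symmetric 3-point moving average
t = np.arange(5, 50)

def filtered(m):
    return sum(w * m(t - j) for j, w in a.items())

linear = lambda s: 2.0 + 0.5 * s                   # m_t = c0 + c1 t
quadratic = lambda s: 2.0 + 0.5 * s + 0.1 * s**2   # degree-2 trend

print(np.max(np.abs(filtered(linear) - linear(t))))        # ~0: passed exactly
print(np.max(np.abs(filtered(quadratic) - quadratic(t))))  # nonzero: (ii) fails for r = 2
```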


MIDTERM-2 Questions

Sample Question 11

Let $\{Z_t\}$ be a zero mean $\text{WN}(\sigma^2)$ process, and let $X_t = Z_t + \sum_{j=1}^{\infty} \left(\frac{1}{2}\right)^{j-1} Z_{t-j}$. Find the ACVF and the ACF of $\{X_t\}$.
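
Once you have a closed form for $\gamma_X(h) = \sigma^2 \sum_j \psi_j \psi_{j+h}$, where $\psi_0 = 1$ and $\psi_j = (1/2)^{j-1}$ for $j \ge 1$, a truncated numerical sum is a cheap way to check it; the truncation length and $\sigma^2 = 1$ below are arbitrary.

```python
import numpy as np

# Numerical check of gamma_X(h) = sigma^2 * sum_j psi_j * psi_{j+h}
# for the linear process with psi_0 = 1 and psi_j = (1/2)**(j-1), j >= 1.
sigma2, J = 1.0, 200                  # truncation J large enough that the tail is negligible
psi = np.empty(J)
psi[0] = 1.0
psi[1:] = 0.5 ** np.arange(J - 1)     # psi_j = (1/2)^(j-1) for j = 1, ..., J-1

for h in range(4):
    gamma_h = sigma2 * np.sum(psi[: J - h] * psi[h:])
    print(h, gamma_h)                 # compare with your hand-derived closed form
```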

Sample Question 6

Suppose that $\{X_t\}$ is a MA(2) process

$$X_t = \epsilon_t + \beta_1 \epsilon_{t-1} + \beta_2 \epsilon_{t-2}$$

where $\{\epsilon_t\}$ is a white noise process with mean $0$ and variance $\sigma^2$.

(a) If $\rho(s)$ is the autocorrelation function of $\{X_t\}$, find an expression for $\rho(1)$. For what values of $\beta_1$ and $\beta_2$ is $\rho(1) = 0$?

(b) Define $Y_t = X_t + 2X_{t-1}$. Show that $\{Y_t\}$ is also a MA process and give its spectral density function.

Sample Question 13

Let $\{Z_t\}$ be a zero-mean $\text{WN}(\sigma^2)$ process, and let $X_t = \sum_{j=0}^{N} \alpha^j Z_{t-j}$, where $\alpha$ is a nonzero constant and $N \ge 1$ is a fixed positive integer.

(a) Find the ACVF of $X_t$

(b) Find the ACF of $X_t$

Sample Question 19

Let $X_t = Y_0 s_t + Y_1 s_{t-1}$, where $\{s_t\}$ is seasonal with period 2, and $Y_t = Z_t + \theta Z_{t-1}$ is an MA(1) process with MA coefficient $\theta$, and $\{Z_t\}$ is a zero-mean $\text{WN}(\sigma^2)$ process. Compute the ACF of $\{X_t\}$.

Sample Question 16

Suppose $\{X_t\}$ and $\{Y_t\}$ are independent (i.e., $X_r$ and $Y_s$ are independent for every $r$ and $s$), $0$-mean, stationary processes with autocovariance functions $\gamma_X(h)$ and $\gamma_Y(h)$, respectively. Let $Z_t = X_t Y_t$. What is the ACVF and ACF of $\{Z_t\}$?
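
A simulation sketch can be used to check whatever formula you derive for $\gamma_Z$: generate two long independent AR(1) paths (an arbitrary illustrative choice of stationary processes), form $Z_t = X_t Y_t$, and compare sample autocovariances. The parameters and series length below are arbitrary.

```python
import numpy as np

# Simulation sketch: two independent AR(1) series (arbitrary illustrative choice),
# their product Z_t = X_t * Y_t, and sample autocovariances at a few lags.
rng = np.random.default_rng(1)
n, phi_x, phi_y = 100_000, 0.7, -0.4

def ar1(phi):
    e = rng.normal(size=n)
    x = np.empty(n)
    x[0] = e[0] / np.sqrt(1 - phi**2)     # start roughly in stationarity
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    return x

def sample_acvf(x, h):
    x = x - x.mean()
    return np.mean(x[: n - h] * x[h:])

X, Y = ar1(phi_x), ar1(phi_y)
Z = X * Y
for h in range(4):
    # compare sample_acvf(Z, h) against your candidate formula in gamma_X and gamma_Y
    print(h, sample_acvf(Z, h), sample_acvf(X, h), sample_acvf(Y, h))
```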


MIDTERM-3 is skipped for now


MIDTERM-4 Questions

0:12 Predicted Question 1 - Plot interpretation

15:55 Predicted Question 9 - Exponential Smoother

13:20 Predicted Question 8 - Symmetric Filter

11:47 Predicted Question 7 - MA(1) Cov Matrix

8:40 Predicted Question 6 - MA(1) process

7:03 Predicted Question 5 - Seasonal and Filter

5:15 Predicted Question 4 - MA(Infinity) ACF computation

3:42 Linear Filter

1:30 AR(2)


FINAL-1-1

1:03:15 Example 2.5.1 AR(1) One step Prediction, BLP

Example 2.5.1 — AR(1) one-step prediction (BLP)

Question. Suppose $\{X_t\}$ is a stationary AR(1) process

$$X_t = \phi X_{t-1} + Z_t, \quad |\phi| < 1, \quad \{Z_t\} \sim \mathrm{WN}(0, \sigma^2).$$

For $n \geq 1$, find the best linear predictor of $X_{n+1}$ based on $X_1, \ldots, X_n$, and compute its mean squared error.
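
One way to verify your answer numerically is to plug the AR(1) ACVF $\gamma(h) = \sigma^2 \phi^{|h|}/(1-\phi^2)$ into the prediction equations $\Gamma_n \mathbf{a}_n = \boldsymbol{\gamma}_n$ and solve them directly; the values $\phi = 0.6$, $\sigma^2 = 1$, $n = 5$ below are arbitrary. A minimal numpy sketch:

```python
import numpy as np

# Numerical check of the one-step BLP of X_{n+1} from X_1, ..., X_n for an AR(1),
# using gamma(h) = sigma^2 * phi^|h| / (1 - phi^2) and the prediction equations
# Gamma_n a = gamma_n, with a = (a_1, ..., a_n) the weights on (X_n, ..., X_1).
phi, sigma2, n = 0.6, 1.0, 5
gamma = lambda h: sigma2 * phi ** np.abs(h) / (1 - phi**2)

idx = np.arange(n)
Gamma_n = gamma(np.abs(idx[:, None] - idx[None, :]))   # Toeplitz covariance matrix
gamma_n = gamma(np.arange(1, n + 1))                   # (gamma(1), ..., gamma(n))

a = np.linalg.solve(Gamma_n, gamma_n)
mse = gamma(0) - a @ gamma_n
print(a)     # compare with your hand-derived coefficients
print(mse)   # compare with your hand-derived mean squared error
```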


Predicted Question 9 — Exponential smoother ACVF

Question. Consider the exponential smoother with smoothing parameter $\alpha \in (0, 1)$ and let $\{X_t\}$ be the result of applying this smoother to $\{Z_t\}$, where $\{Z_t\}$ is zero-mean $\mathrm{WN}(\sigma^2)$, i.e.

$$X_t = \sum_{i=0}^{\infty} \alpha(1-\alpha)^i Z_{t-i}.$$

Find the autocovariance function (ACVF) of $\{X_t\}$ at all lags $h$.
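
As with Sample Question 11 above, a truncated evaluation of $\gamma_X(h) = \sigma^2 \sum_{i \ge 0} \psi_i \psi_{i+h}$ with $\psi_i = \alpha(1-\alpha)^i$ gives a quick numerical check of whatever closed form you derive; $\alpha = 0.3$, $\sigma^2 = 1$, and the truncation length below are arbitrary.

```python
import numpy as np

# Truncated check of gamma_X(h) = sigma^2 * sum_i psi_i * psi_{i+h},
# where psi_i = alpha * (1 - alpha)^i for the exponential smoother.
alpha, sigma2, J = 0.3, 1.0, 500
psi = alpha * (1 - alpha) ** np.arange(J)

for h in range(4):
    print(h, sigma2 * np.sum(psi[: J - h] * psi[h:]))   # compare with your closed form
```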


43:20 Sample Q10 - AR(2), Causal, PACF of AR(2) given coefficient

Sample question 10 — AR(2) causality and PACF

  1. (17 points total). Consider the AR(2) process $\{X_t\}$ satisfying
$$X_t - 4a X_{t-1} + 3a^2 X_{t-2} = Z_t, \quad Z_t \sim \mathrm{WN}(0, \sigma^2), \tag{1}$$

where $a$ is an unknown constant.

(a) (8 pts) Find the range of $a$ such that the process (1) is causal.

(b) (9 pts) If $a = 1/6$ in part (a), find the PACF $\alpha_j$ of $\{X_t\}$ for all $j \geq 1$.
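
For part (a), a numerical root check complements the algebra: causality requires every root of the AR polynomial $\phi(z) = 1 - 4az + 3a^2z^2$ to lie outside the unit circle. The sketch below uses the $a = 1/6$ from part (b); any other value can be substituted to probe the boundary.

```python
import numpy as np

# Causality check for X_t - 4a X_{t-1} + 3a^2 X_{t-2} = Z_t at a given a:
# the AR polynomial is phi(z) = 1 - 4a z + 3a^2 z^2, and causality requires
# all roots of phi(z) = 0 to lie strictly outside the unit circle.
a = 1 / 6
roots = np.roots([3 * a**2, -4 * a, 1])   # np.roots wants highest power first
print(roots, np.abs(roots))               # causal iff every |root| > 1
```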


41:19 Sample Q3 - Yule-Walker, Compute Autocorrelation Function of AR(2), Compute PACF of AR(2)

Sample question 3 — Yule–Walker for AR(2)

  1. [10 marks] Suppose that $\{X_t\}$ is an AR(2) process with
$$X_t = X_{t-1} - \frac{1}{4} X_{t-2} + \varepsilon_t,$$

where $\{\varepsilon_t\}$ is white noise with mean $0$ and variance $\sigma^2$.

(a) If $\rho(s)$ is the autocorrelation function of $\{X_t\}$, show that

$$\rho(1) = \frac{4}{5} \quad \text{and} \quad \rho(2) = \frac{11}{20}.$$

(Hint: Yule–Walker equations.)
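
Because the target values are stated in the question, a short recursion is enough to confirm the algebra: the Yule–Walker equations give $\rho(1) = \phi_1/(1-\phi_2)$ and $\rho(h) = \phi_1\rho(h-1) + \phi_2\rho(h-2)$ for $h \ge 2$. A minimal sketch:

```python
# Yule-Walker check for X_t = X_{t-1} - (1/4) X_{t-2} + eps_t:
# rho(1) = phi1 / (1 - phi2), and rho(h) = phi1*rho(h-1) + phi2*rho(h-2) for h >= 2.
phi1, phi2 = 1.0, -0.25

rho = {0: 1.0, 1: phi1 / (1 - phi2)}
for h in range(2, 6):
    rho[h] = phi1 * rho[h - 1] + phi2 * rho[h - 2]

print(rho[1], rho[2])   # 0.8 and 0.55, i.e. 4/5 and 11/20
```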


38:08 AR(3) ACF and PACF Calculation

Sample question 8 — AR(3) ACF and PACF

  1. [10 marks] Suppose that $\{X_t\}$ is an AR(3) process with
$$X_t = \phi X_{t-3} + \varepsilon_t,$$

where $|\phi| < 1$ and $\{\varepsilon_t\}$ is white noise with mean $0$ and variance $\sigma^2$.

(a) If $\rho(s)$ is the autocorrelation function of $\{X_t\}$, show that

$$\rho(1) = \rho(2) = 0, \qquad \rho(3) = \phi.$$

(Hint: Yule–Walker equations.)

(b) Let $\pi(s)$ be the partial autocorrelation function of $\{X_t\}$. Evaluate $\pi(1)$, $\pi(2)$, and $\pi(3)$.


32:50 PACF standard computation procedures

Sample question 7 — MA(2) ACF and a derived process

  1. [10 marks] Suppose that $\{X_t\}$ is an MA(2) process
$$X_t = \varepsilon_t + \beta_1 \varepsilon_{t-1} + \beta_2 \varepsilon_{t-2},$$

where $\{\varepsilon_t\}$ is white noise with mean $0$ and variance $\sigma^2$.

(a) If $\rho(s)$ is the autocorrelation function of $\{X_t\}$, find an expression for $\rho(1)$. For what values of $\beta_1$ and $\beta_2$ is $\rho(1) = 0$?

(b) Define $Y_t = X_t + 2 X_{t-1}$. Show that $\{Y_t\}$ is also an MA process and give its spectral density function.


25:35 Sample Q6 - ARMA

Sample question 6 — Classify ARMA models

  1. [10 marks] Consider the following five ARMA processes; in each case, $\{\varepsilon_t\}$ is zero-mean white noise with $\mathrm{Cov}(\varepsilon_t, X_{t-s}) = 0$ for $s \geq 1$.

(1) $X_t = 0.9 X_{t-1} + \varepsilon_t - 0.9 \varepsilon_{t-1}$

(2) $X_t = 1.5 X_{t-1} - 0.56 X_{t-2} + \varepsilon_t + 1.5 \varepsilon_{t-1} + 0.56 \varepsilon_{t-2}$

(3) $X_t = 0.7 X_{t-1} - 0.7 X_{t-2} + \varepsilon_t - 0.5 \varepsilon_{t-1}$

(4) $X_t = 1.8 X_{t-1} - 0.8 X_{t-2} + \varepsilon_t + 0.5 \varepsilon_{t-1}$

(5) $X_t = 1.5 X_{t-1} - 0.75 X_{t-2} + 0.125 X_{t-4} + \varepsilon_t + 1.5 \varepsilon_{t-1} + 0.75 \varepsilon_{t-2} + 0.125 \varepsilon_{t-3}$

(a) Which of these processes has an autoregressive component with a unit root? Justify your answer.

(b) Which of these processes is white noise? Justify your answer.
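
One way to organize parts (a) and (b) is to factor each AR and MA polynomial numerically: an AR root of modulus 1 signals a unit root, and AR and MA factors that cancel completely signal a process that reduces to white noise. The coefficient lists below are transcribed from models (1)–(5), written in increasing powers of $z$; this is only a computational aid, and the justification still has to be written out.

```python
import numpy as np
from numpy.polynomial import polynomial as P

# AR and MA polynomials phi(z), theta(z) for models (1)-(5),
# with coefficients listed in increasing powers of z.
models = {
    1: ([1, -0.9],                  [1, -0.9]),
    2: ([1, -1.5, 0.56],            [1, 1.5, 0.56]),
    3: ([1, -0.7, 0.7],             [1, -0.5]),
    4: ([1, -1.8, 0.8],             [1, 0.5]),
    5: ([1, -1.5, 0.75, 0, -0.125], [1, 1.5, 0.75, 0.125]),
}

for k, (ar, ma) in models.items():
    ar_roots = P.polyroots(ar)
    ma_roots = P.polyroots(ma)
    # An AR root with |root| == 1 indicates a unit root in the autoregressive part;
    # identical AR and MA root sets indicate the factors cancel to white noise.
    print(k, np.round(ar_roots, 3), np.round(ma_roots, 3))
```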


22:06 ACF/PACF for ARMA models Graph Plotting

17:15 Sample Q2 - ARMA characteristic equations / stationarity and invertibility criteria / MA(infinity) series truncated at lag 4

Sample question 2 — ARMA(2,1) roots, stationarity, invertibility

Consider the ARMA(2,1) model

$$x_t = 1.3 x_{t-1} - 0.8 x_{t-2} + w_t + 1.2 w_{t-1}, \quad w_t \sim \mathrm{WN}(0, \sigma^2).$$

(a) Find the roots of the AR and MA characteristic equations.

(b) Determine whether this process is stationary and invertible. Explain.
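
A numerical check for part (a): the AR characteristic polynomial is $\phi(z) = 1 - 1.3z + 0.8z^2$ and the MA polynomial is $\theta(z) = 1 + 1.2z$; whether their roots lie outside the unit circle settles part (b). A minimal sketch:

```python
import numpy as np

# Roots of the ARMA(2,1) characteristic polynomials
# phi(z) = 1 - 1.3 z + 0.8 z^2   and   theta(z) = 1 + 1.2 z.
ar_roots = np.roots([0.8, -1.3, 1])   # np.roots expects highest power first
ma_roots = np.roots([1.2, 1])

print(ar_roots, np.abs(ar_roots))     # stationary/causal iff every AR |root| > 1
print(ma_roots, np.abs(ma_roots))     # invertible iff every MA |root| > 1
```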


FINAL-1-2

0:48 P2Q1 - Seasonality, Difference Operator

4:07 P2Q2 - Prediction -> Best Linear Predictor P(x1 | x2, x3)

8:07 P2Q3 - Best linear predictor + prediction error MSE computation

15:52 P2SampleQ15 - ACF


Appendix 1: Exam Instructions

Format:

No external aids (no cheat-sheet, calculator, etc.)

The exam will include topics from before the midterm, and many of this course's topics tend to build on one another, so it's a good idea to review the foundations (e.g. classical decomposition, stationarity) when studying.

The exam does NOT cover the innovations algorithm or any of the new material from week 13 (e.g. music data, power spectra, Fourier transforms).

A formula sheet will be attached to the exam. Note that it gives the formulas for best linear predictors (BLPs) and their MSEs, but you will have to know what the symbols mean in context.

Preparing Short Answer Responses

Prepare to describe, in a few sentences each, some of the broader concepts we’ve discussed in time series. You should use technical language where necessary (these are not coffee dates). To get a picture of the difficulty and level of detail expected for this task, please review the short-answer problems/solutions featured in Midterm Solutions for Studying.

For the most well-rounded studying experience, I recommend practicing how to discuss the following topics (not all of them will be on the test, but they do connect to form a bigger picture that you should be comfortable with):

  • Classical Decomposition
  • Trend estimation/elimination
  • Seasonal component estimation/elimination
  • Multi-step modelling techniques (e.g. S1 method, or combining trend/seasonal estimation techniques like we do in workshop)
  • White Noise hypothesis testing
  • ARMA Modelling
  • Estimating model parameters
  • Model selection (e.g. choosing orders p and q like we did in workshop 5)
  • Recognizing MA and AR structures (or lack thereof) using sample ACF and sample PACF plots
  • Residual analysis (see workshop 5)
  • Forecasting
  • Training/Testing frameworks
  • Comparing plots for different forecasting models (e.g. looking at accuracy, prediction intervals)
  • Finding best Linear Predictors, minimum mean squared error, prediction intervals
  • Plot interpretation
  • ACVFs, ACFs, PACFs - what do these plots tell us?
  • Residual plots (see workshop 5)
  • Time plots (as in, the actual data plotted against time)
  • Forecasting results, prediction intervals