Markov switching autoregression models

This notebook provides an example of the use of Markov switching models in statsmodels to replicate a number of results presented in Kim and Nelson (1999). It applies the Hamilton (1989) filter and the Kim (1994) smoother.

These results are tested against the Markov-switching models from EViews 8, which can be found at http://www.eviews.com/EViews8/ev8ecswitch_n.html#MarkovAR, and the Markov-switching models of Stata 14, which can be found at http://www.stata.com/manuals14/tsmswitch.pdf.

[1]:
%matplotlib inline

import numpy as np
import pandas as pd
import statsmodels.api as sm
import matplotlib.pyplot as plt
import requests
from io import BytesIO

# NBER recessions
from pandas_datareader.data import DataReader
from datetime import datetime
usrec = DataReader('USREC', 'fred', start=datetime(1947, 1, 1), end=datetime(2013, 4, 1))

Hamilton (1989) switching model of GNP

This replicates Hamilton’s (1989) seminal paper introducing Markov-switching models. The model is an autoregressive model of order 4 in which the mean of the process switches between two regimes. It can be written:

\[y_t = \mu_{S_t} + \phi_1 (y_{t-1} - \mu_{S_{t-1}}) + \phi_2 (y_{t-2} - \mu_{S_{t-2}}) + \phi_3 (y_{t-3} - \mu_{S_{t-3}}) + \phi_4 (y_{t-4} - \mu_{S_{t-4}}) + \varepsilon_t\]

Each period, the regime transitions according to the following matrix of transition probabilities:

\[\begin{split} P(S_t = s_t | S_{t-1} = s_{t-1}) = \begin{bmatrix} p_{00} & p_{10} \\ p_{01} & p_{11} \end{bmatrix}\end{split}\]

where \(p_{ij}\) is the probability of transitioning from regime \(i\) to regime \(j\).
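To make the data-generating process concrete, here is a minimal simulation sketch of a two-regime switching-mean AR(4); the regime means, AR coefficients, transition probabilities, and error standard deviation below are illustrative values, not Hamilton's estimates.

import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.2, -0.4])                    # hypothetical regime means (expansion, recession)
phi = np.array([0.0, -0.1, -0.25, -0.2])      # hypothetical AR coefficients phi_1..phi_4
P = np.array([[0.90, 0.25],                   # P[i, j] = Prob(S_t = i | S_{t-1} = j),
              [0.10, 0.75]])                  # matching the column layout of the matrix above
sigma = 0.8                                   # hypothetical std. dev. of the error term

T = 200
S = np.zeros(T, dtype=int)
y = np.zeros(T)
for t in range(4, T):
    S[t] = rng.choice(2, p=P[:, S[t - 1]])    # draw today's regime given yesterday's
    lags = y[t - 4:t][::-1]                   # y_{t-1}, ..., y_{t-4}
    lag_means = mu[S[t - 4:t][::-1]]          # mu_{S_{t-1}}, ..., mu_{S_{t-4}}
    y[t] = mu[S[t]] + phi @ (lags - lag_means) + rng.normal(scale=sigma)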

The model class is MarkovAutoregression in the time-series part of statsmodels. In order to create the model, we must specify the number of regimes with k_regimes=2 and the order of the autoregression with order=4. By default the model also allows the autoregressive coefficients to switch across regimes, so here we specify switching_ar=False to hold them fixed.

After creation, the model is fit via maximum likelihood estimation. Under the hood, good starting parameters are found using a number of steps of the expectation maximization (EM) algorithm, and a quasi-Newton (BFGS) algorithm is applied to quickly find the maximum.

[2]:
# Get the RGNP data to replicate Hamilton
dta = pd.read_stata('https://www.stata-press.com/data/r14/rgnp.dta').iloc[1:]
dta.index = pd.DatetimeIndex(dta.date, freq='QS')
dta_hamilton = dta.rgnp

# Plot the data
dta_hamilton.plot(title='Growth rate of Real GNP', figsize=(12,3))

# Fit the model
mod_hamilton = sm.tsa.MarkovAutoregression(dta_hamilton, k_regimes=2, order=4, switching_ar=False)
res_hamilton = mod_hamilton.fit()
[3]:
res_hamilton.summary()

We plot the filtered and smoothed probabilities of a recession. Filtered refers to an estimate of the probability at time \(t\) based on data up to and including time \(t\) (but excluding time \(t+1, ..., T\)). Smoothed refers to an estimate of the probability at time \(t\) using all the data in the sample.
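In symbols, the two quantities plotted below are:

\[\begin{split}\begin{align} \text{filtered: } & P(S_t = s_t \mid y_1, \ldots, y_t) \\ \text{smoothed: } & P(S_t = s_t \mid y_1, \ldots, y_T) \end{align}\end{split}\]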

For reference, the shaded periods represent the NBER recessions.

[4]:
fig, axes = plt.subplots(2, figsize=(7,7))
ax = axes[0]
ax.plot(res_hamilton.filtered_marginal_probabilities[0])
ax.fill_between(usrec.index, 0, 1, where=usrec['USREC'].values, color='k', alpha=0.1)
ax.set_xlim(dta_hamilton.index[4], dta_hamilton.index[-1])
ax.set(title='Filtered probability of recession')

ax = axes[1]
ax.plot(res_hamilton.smoothed_marginal_probabilities[0])
ax.fill_between(usrec.index, 0, 1, where=usrec['USREC'].values, color='k', alpha=0.1)
ax.set_xlim(dta_hamilton.index[4], dta_hamilton.index[-1])
ax.set(title='Smoothed probability of recession')

fig.tight_layout()
[Figure: filtered and smoothed probabilities of recession, with NBER recessions shaded]

From the estimated transition matrix we can calculate the expected duration of a recession versus an expansion.

[5]:
print(res_hamilton.expected_durations)

In this case, it is expected that a recession will last about one year (4 quarters) and an expansion about two and a half years.
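The calculation itself is simple: regime spells are geometrically distributed, so the expected duration of regime \(i\) is \(1 / (1 - p_{ii})\), which is what expected_durations reports. A minimal check by hand, using hypothetical staying probabilities (not the estimates above):

import numpy as np

p00, p11 = 0.90, 0.75                   # hypothetical Prob(stay in regime 0), Prob(stay in regime 1)
print(1 / (1 - np.array([p00, p11])))   # [10.  4.] periods, i.e. quarters for this dataset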

Kim, Nelson, and Startz (1998) Three-state Variance Switching

This model demonstrates estimation with regime heteroskedasticity (switching of variances) and no mean effect. The dataset can be reached at http://econ.korea.ac.kr/~cjkim/MARKOV/data/ew_excs.prn.

The model in question is:

\[\begin{split}\begin{align} y_t & = \varepsilon_t \\ \varepsilon_t & \sim N(0, \sigma_{S_t}^2) \end{align}\end{split}\]

Since there is no autoregressive component, this model can be fit using the MarkovRegression class. Because there is no mean effect, we specify trend='nc'. Three regimes are hypothesized for the switching variances, so we specify k_regimes=3 and switching_variance=True (by default, the variance is assumed to be the same across regimes).
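As a concrete illustration of this data-generating process, the following is a minimal simulation sketch with three hypothetical variance levels and a hypothetical transition matrix; it is not intended to reproduce the estimates below, but data simulated this way could be passed to the same MarkovRegression call in place of the downloaded series.

import numpy as np

rng = np.random.default_rng(0)
sigma = np.array([0.5, 1.5, 4.0])       # hypothetical regime standard deviations (low, medium, high)
P = np.array([[0.98, 0.02, 0.00],       # hypothetical P[i, j] = Prob(S_t = i | S_{t-1} = j)
              [0.02, 0.95, 0.10],
              [0.00, 0.03, 0.90]])

T = 500
S = np.zeros(T, dtype=int)
y = np.zeros(T)
for t in range(1, T):
    S[t] = rng.choice(3, p=P[:, S[t - 1]])
    y[t] = rng.normal(scale=sigma[S[t]])    # y_t = eps_t, eps_t ~ N(0, sigma_{S_t}^2)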

[6]:
# Get the dataset
ew_excs = requests.get('http://econ.korea.ac.kr/~cjkim/MARKOV/data/ew_excs.prn').content
raw = pd.read_table(BytesIO(ew_excs), header=None, skipfooter=1, engine='python')
raw.index = pd.date_range('1926-01-01', '1995-12-01', freq='MS')

dta_kns = raw.loc[:'1986'] - raw.loc[:'1986'].mean()

# Plot the dataset
dta_kns[0].plot(title='Excess returns', figsize=(12, 3))

# Fit the model
mod_kns = sm.tsa.MarkovRegression(dta_kns, k_regimes=3, trend='nc', switching_variance=True)
res_kns = mod_kns.fit()
[7]:
res_kns.summary()

Below we plot the probabilities of being in each of the regimes; only in a few periods is a high-variance regime probable.

[8]:
fig, axes = plt.subplots(3, figsize=(10,7))

ax = axes[0]
ax.plot(res_kns.smoothed_marginal_probabilities[0])
ax.set(title='Smoothed probability of a low-variance regime for stock returns')

ax = axes[1]
ax.plot(res_kns.smoothed_marginal_probabilities[1])
ax.set(title='Smoothed probability of a medium-variance regime for stock returns')

ax = axes[2]
ax.plot(res_kns.smoothed_marginal_probabilities[2])
ax.set(title='Smoothed probability of a high-variance regime for stock returns')

fig.tight_layout()
[Figure: smoothed probabilities of the low-, medium-, and high-variance regimes for stock returns]

Filardo (1994) Time-Varying Transition Probabilities

This model demonstrates estimation with time-varying transition probabilities. The dataset can be reached at http://econ.korea.ac.kr/~cjkim/MARKOV/data/filardo.prn.

In the models above, we assumed that the transition probabilities were constant across time. Here, we allow the probabilities to change with the state of the economy. Otherwise, the model is the same Markov autoregression as in Hamilton (1989).

Each period, the regime now transitions according to the following matrix of time-varying transition probabilities:

\[\begin{split} P(S_t = s_t | S_{t-1} = s_{t-1}) = \begin{bmatrix} p_{00,t} & p_{10,t} \\ p_{01,t} & p_{11,t} \end{bmatrix}\end{split}\]

where \(p_{ij,t}\) is the probability of transitioning from regime \(i\) to regime \(j\) in period \(t\), and is defined to be:

\[p_{ij,t} = \frac{\exp\{ x_{t-1}' \beta_{ij} \}}{1 + \exp\{ x_{t-1}' \beta_{ij} \}}\]

Instead of estimating the transition probabilities directly, the regression coefficients \(\beta_{ij}\) are estimated as part of maximum likelihood. These coefficients relate the transition probabilities to a vector of predetermined or exogenous regressors \(x_{t-1}\).
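For intuition, here is a minimal sketch of the logistic mapping from the lagged regressors to a single transition probability, using a hypothetical coefficient vector \(\beta_{ij}\) (a constant and one slope) and a few hypothetical values of the leading indicator:

import numpy as np

beta_ij = np.array([1.5, 0.8])                            # hypothetical (constant, slope)
x_lag = np.column_stack([np.ones(3), [-1.0, 0.0, 1.0]])   # hypothetical x_{t-1} = [1, leading indicator]

z = x_lag @ beta_ij
p_ij_t = np.exp(z) / (1 + np.exp(z))                      # p_{ij,t} = exp(z) / (1 + exp(z))
print(p_ij_t)                                             # approx. [0.67, 0.82, 0.91]: rises with the indicator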

[9]:
# Get the dataset
filardo = requests.get('http://econ.korea.ac.kr/~cjkim/MARKOV/data/filardo.prn').content
dta_filardo = pd.read_table(BytesIO(filardo), sep=' +', header=None, skipfooter=1, engine='python')
dta_filardo.columns = ['month', 'ip', 'leading']
dta_filardo.index = pd.date_range('1948-01-01', '1991-04-01', freq='MS')

dta_filardo['dlip'] = np.log(dta_filardo['ip']).diff()*100
# Deflate pre-1960 observations by the ratio of standard deviations
# See hmt_tvp.opt or Filardo (1994) p. 302
std_ratio = dta_filardo.loc['1960-01-01':, 'dlip'].std() / dta_filardo.loc[:'1959-12-01', 'dlip'].std()
dta_filardo.loc[:'1959-12-01', 'dlip'] *= std_ratio

dta_filardo['dlleading'] = np.log(dta_filardo['leading']).diff()*100
dta_filardo['dmdlleading'] = dta_filardo['dlleading'] - dta_filardo['dlleading'].mean()

# Plot the data
dta_filardo['dlip'].plot(title='Standardized growth rate of industrial production', figsize=(13,3))
plt.figure()
dta_filardo['dmdlleading'].plot(title='Leading indicator', figsize=(13,3));

The time-varying transition probabilities are specified by the exog_tvtp parameter.

Here we demonstrate another feature of model fitting: the use of a random search for MLE starting parameters. Because Markov switching models are often characterized by many local maxima of the likelihood function, performing an initial optimization step can be helpful in finding the best parameters.

Below, we specify that 20 random perturbations from the starting parameter vector are examined, and the best one is used as the actual starting parameters. Because of the random nature of the search, we seed the random number generator beforehand to allow replication of the result.

[10]:
mod_filardo = sm.tsa.MarkovAutoregression(
    dta_filardo.iloc[2:]['dlip'], k_regimes=2, order=4, switching_ar=False,
    exog_tvtp=sm.add_constant(dta_filardo.iloc[1:-1]['dmdlleading']))

np.random.seed(12345)
res_filardo = mod_filardo.fit(search_reps=20)
[11]:
res_filardo.summary()

Below we plot the smoothed probability of the economy operating in a low-production state, and again include the NBER recessions for comparison.

[12]:
fig, ax = plt.subplots(figsize=(12,3))

ax.plot(res_filardo.smoothed_marginal_probabilities[0])
ax.fill_between(usrec.index, 0, 1, where=usrec['USREC'].values, color='gray', alpha=0.2)
ax.set_xlim(dta_filardo.index[6], dta_filardo.index[-1])
ax.set(title='Smoothed probability of a low-production state');
[Figure: smoothed probability of a low-production state, with NBER recessions shaded]

Using the time-varying transition probabilities, we can see how the expected duration of a low-production state changes over time:
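In the same spirit as the constant-probability case above, the natural per-period measure treats the period-\(t\) probabilities as if they were to remain fixed, so that for the low-production regime:

\[\text{expected duration}_t = \frac{1}{1 - p_{00,t}}\]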

[13]:
res_filardo.expected_durations[0].plot(
    title='Expected duration of a low-production state', figsize=(12,3));

During recessions, the expected duration of a low-production state is much higher than in an expansion.