We compare a number of data-rich methods that are widely used
in macroeconomic forecasting with a lesser-known alternative:
partial least squares (PLS) regression. In this method,
linear, orthogonal combinations of a large number of
predictor variables are constructed such that the
covariance between a target variable and these common
components is maximized. We show theoretically that when the
data have a factor structure, PLS regression can be seen as
an alternative way to approximate this unobserved factor structure.
We also prove that when a large data set has a weak factor
structure, which possibly vanishes in the limit, PLS
regression still provides asymptotically the best fit for the
target variable of interest. Monte Carlo experiments confirm
our theoretical results that PLS regression is at least as
good as principal components regression and close to Bayesian
regression when the data have a factor structure. When the
factor structure in the data is weak, PLS regression always
outperforms principal components and, in most cases, Bayesian
regressions. Finally, we apply PLS, principal components, and
Bayesian regressions on a large panel of monthly U.S.
macroeconomic data to forecast key variables across different
subperiods, and PLS regression usually has the best
out-of-sample performance.
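The core mechanism the abstract describes can be sketched as follows: the first PLS component is the linear combination of the predictors whose covariance with the target is maximal, and the fit is an OLS regression of the target on that component. This is a minimal one-component illustration in NumPy on simulated single-factor data; the data-generating process and all variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sketch of one-component PLS regression on simulated
# factor-structured data (DGP and names are assumptions for this demo).
rng = np.random.default_rng(0)
n, p = 200, 50

# One-factor structure: every predictor loads on a common latent factor.
f = rng.standard_normal(n)                     # latent factor
loadings = rng.standard_normal(p)
X = np.outer(f, loadings) + rng.standard_normal((n, p))
y = f + 0.5 * rng.standard_normal(n)           # target driven by the factor

# Center predictors and target, as PLS assumes.
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# First PLS weight vector: the direction w (||w|| = 1) maximizing
# Cov(X w, y) is proportional to X'y.
w = Xc.T @ yc
w /= np.linalg.norm(w)
t = Xc @ w                                     # first PLS component

# OLS of the target on the single component gives the PLS fit.
beta = (t @ yc) / (t @ t)
y_hat = y.mean() + beta * t

# In-sample R^2 of the one-component fit.
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

With a strong factor structure, the single PLS component approximates the latent factor well and the fit captures most of the explainable variation; additional components would be extracted from residualized (deflated) predictors.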
Rotterdam Seminars Econometric Institute
- Speaker(s)
- Jan Groen (Federal Reserve Bank of New York)
- Date
- 2009-09-03
- Location
- Rotterdam