DSGE models are the standard tool of quantitative macroeconomics. We use them to measure economic phenomena and to provide policy advice. However, since Kydland and Prescott (1982), the profession has debated how to take these models to the data. Kydland and Prescott proposed to calibrate their model. Why? Macroeconomists could not compute their models efficiently, the techniques required for estimating DSGE models by likelihood did not exist, and the models fared badly under likelihood ratio tests. Calibration offered a temporary solution: by focusing on only a very limited set of moments of the model, researchers could claim partial success and keep developing their theory.

The landscape changed in the 1990s, with developments along three fronts. First, macroeconomists learned how to compute equilibrium models with rich dynamics efficiently. Second, statisticians developed simulation techniques, such as Markov chain Monte Carlo (MCMC), that are required to estimate DSGE models. Third, and perhaps most important, computing power became so cheap that we can now do things that were unthinkable twenty years ago.

This proposal aims to estimate non-linear and/or non-normal DSGE models using a likelihood approach. Why non-linear models? Previous research has shown that second-order approximation errors in the policy function have first-order effects on the likelihood function. Why non-normal models? Time-varying volatility is key to understanding the Great Moderation: Kim and Nelson (1999), McConnell and Pérez-Quirós (2000), and Stock and Watson (2002) have documented a decline in the variance of output growth since the mid-1980s, and only DSGE models with a richer structure than normal innovations can account for it.
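To fix ideas on the MCMC methods mentioned above, the following is a minimal sketch of a random-walk Metropolis sampler, the workhorse MCMC algorithm behind most Bayesian DSGE estimation. The model here is a deliberately toy stand-in (a normal likelihood for an unknown mean with a standard-normal prior), not an actual DSGE likelihood; all names and tuning values are illustrative assumptions.

```python
import numpy as np

# Toy setting (assumed for illustration): observations from N(mu, 1),
# prior mu ~ N(0, 1). In DSGE estimation, log_posterior would instead
# call a filter that evaluates the model's likelihood.
rng = np.random.default_rng(0)
data = rng.normal(1.0, 1.0, size=200)  # simulated data

def log_posterior(mu):
    log_prior = -0.5 * mu**2                    # N(0, 1) prior on mu
    log_like = -0.5 * np.sum((data - mu) ** 2)  # N(mu, 1) likelihood
    return log_prior + log_like

def metropolis(n_draws=5000, step=0.1, mu0=0.0):
    """Random-walk Metropolis: propose, then accept/reject."""
    draws = np.empty(n_draws)
    mu, lp = mu0, log_posterior(mu0)
    for i in range(n_draws):
        prop = mu + step * rng.normal()           # random-walk proposal
        lp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis acceptance
            mu, lp = prop, lp_prop
        draws[i] = mu                             # repeat draw if rejected
    return draws

draws = metropolis()
post_mean = draws[1000:].mean()  # discard burn-in, then summarize
```

The chain's draws, after burn-in, approximate the posterior distribution of the parameter; posterior means, intervals, and model comparisons are then computed from these draws. The proposal step size is a tuning choice that trades off acceptance rate against mixing speed.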