The Econometric Analysis of Mixed Frequency Data and its use in Policy Making

Final Report Summary - MIDAS (The Econometric Analysis of Mixed Frequency Data and its use in Policy Making)

Executive Summary:

Economic time series are sampled at different frequencies, as many economic series remain costly to collect and are therefore only available at a lower frequency. Examples include many macroeconomic real activity series that have maintained the traditional monthly or quarterly collection and release scheme. Typically, the approach adopted in empirical research is to align the data at the lowest sampling frequency and run regressions with same-frequency data. For example, when studying the relationship between financial markets and the real economy, empirical research usually uses quarterly or monthly aggregates of financial series that align with the available macroeconomic variables. This leads to a loss of information, as data sampled at high frequencies is aggregated, and results in either biases or asymptotic inefficiencies of the regression estimates (see Andreou, Ghysels and Kourtellos (2010, Journal of Econometrics)). Our research pertained to methods which accommodate mixed frequency data in empirical research, as well as tools that allow policy makers to use the steady flow of high frequency data to help guide their decision making process. The methods rely on so-called MIDAS regressions, meaning Mi(xed) Da(ta) S(ampling) regressions that involve data sampled at different frequencies.
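To make the idea concrete, the following is a minimal illustrative sketch of a MIDAS regression on simulated data. It is not code from the project: the specification (a quarterly variable regressed on monthly data through a normalized exponential Almon lag polynomial, estimated by nonlinear least squares) is one standard MIDAS formulation, and all variable names, parameter values, and the simulated series are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

def almon_weights(theta1, theta2, n_lags):
    """Normalized exponential Almon lag weights, a common MIDAS weighting scheme."""
    j = np.arange(n_lags)
    w = np.exp(theta1 * j + theta2 * j**2)
    return w / w.sum()

def build_lag_matrix(x_high, n_low, m, n_lags):
    """Row t holds the n_lags most recent high-frequency obs within low-frequency period t."""
    X = np.empty((n_low, n_lags))
    for t in range(n_low):
        end = (t + 1) * m  # one past the last monthly observation of quarter t
        X[t] = x_high[end - n_lags:end][::-1]  # most recent observation first
    return X

# --- Simulate: 200 quarters of y, driven by a monthly indicator x (3 months/quarter) ---
rng = np.random.default_rng(0)
T, m = 200, 3
n_lags = m                      # use the 3 intra-quarter monthly observations
x_monthly = rng.standard_normal(T * m)
X = build_lag_matrix(x_monthly, T, m, n_lags)

w_true = almon_weights(0.5, -0.3, n_lags)       # true (hypothetical) lag weights
y = 0.5 + 2.0 * (X @ w_true) + 0.1 * rng.standard_normal(T)

# --- Estimate (b0, b1, theta1, theta2) jointly by nonlinear least squares ---
def residuals(params):
    b0, b1, t1, t2 = params
    return y - (b0 + b1 * (X @ almon_weights(t1, t2, n_lags)))

fit = least_squares(residuals, x0=[0.0, 1.0, 0.0, 0.0],
                    bounds=([-10, -10, -3, -3], [10, 10, 3, 3]))
b0_hat, b1_hat, t1_hat, t2_hat = fit.x
print(f"intercept ~ {b0_hat:.3f}, slope ~ {b1_hat:.3f}")
```

The key point the sketch illustrates is that the weight function is parameterized by only two parameters regardless of the number of high-frequency lags, so the monthly information enters the quarterly regression without aggregating it away and without a proliferation of free coefficients.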

One prominent example of MIDAS regression is nowcasting, i.e. 'forecasting' the current or recent aggregate state of an economy. The econometric methods rely on links between economic data available at high frequency – think of financial data – and low frequency data such as GDP. Two recently published surveys feature MIDAS regressions prominently as one of the innovative methods in nowcasting (see Castle, Jennifer L., David F. Hendry, and Oleg I. Kitov. "Forecasting and Nowcasting Macroeconomic Variables: A Methodological Overview." (Oxford University, 2013) and Chauvet, Marcelle, and Simon Potter. "Forecasting output." Handbook of Economic Forecasting, Elliott, G., and Timmermann, A. (eds). (2013)).

Central banks around the world were at the frontline of the recent financial crisis. Critical to their policy decisions was the proper reading of the signals embedded in the flow of economic data. Among the research output of the project, we coordinated a unique joint research effort across two of the leading central banks, evaluating the forecasting performance of both banks during the crisis – using the internal historical records of forecasts used by policy makers during the crisis (see Alessi, Ghysels, Onorante, Peach and Potter, "Central Bank Macroeconomic Forecasting During the Financial Crisis: The European Central Bank and Federal Reserve Bank of New York Experiences," Journal of Business and Economic Statistics (forthcoming), Invited paper ASSA Meetings, Philadelphia 2014). It is, to the best of my knowledge, the first direct research collaboration between two central banks across the Atlantic. The research shows how the use of financial market data and MIDAS regressions could have substantially improved the quality of forecasts during the recent financial crisis. The wider impact of this research is clear. Whenever the next financial crisis happens – hopefully not any time soon – we will have better tools to help policy makers make more informed and, in particular, timely updates about the state of the economy and predictions of its path over the immediate and medium term horizon. The socio-economic and societal impact of these improvements is substantial, considering that the crisis produced a dramatic contraction of about 4.5% in Gross Domestic Product (GDP) in 2009 for the EU as well as the Euro zone countries.

The project also led to the organization of one of the most important conferences in the field in Europe. The EC-squared is a series of annual international conferences on research in quantitative economics and econometrics, launched in 1990. The acronym (EC)2 stands for European Conferences of the Econom[etr]ics Community. Its main aim is to maintain and extend an adequate forum for both senior and junior European researchers in quantitative economics and econometrics to discuss the progress and results of their research. The conferences are scheduled in the middle of December and last for 2 or 3 days. They are of relatively small scale (fewer than 100 participants) and are very intensive. Each year a different topic of interest is selected as the major theme of the conference. A few leading quantitative economists or econometricians are invited as keynote speakers; the other speakers are selected on the basis of submitted papers, and several participants act as invited discussants. Between sessions of the plenary programme there may be poster sessions, possibly with computer demonstrations and round-table discussions. There are no parallel sessions. Although only a few participants can present at the plenary sessions, most participants do present their research at one of the other conference activities. The 2013 EC-squared conference will be hosted by the University of Cyprus, and the theme is The Econometric Analysis of Mixed Frequency Data, i.e. the topic directly related to the Marie Curie grant. Elena Andreou heads the local organizing committee, and Eric Ghysels, together with Massimiliano Marcellino from Bocconi University, heads the scientific committee. Invited speakers include Michael Clements (UK), Manfred Deistler (Austria), Domenico Giannone (Belgium) and Rossen Valkanov (US).
Since the conference is always held in December, it took place after the completion of the Marie Curie project, but all the planning and organization took place during the term of the grant.

The webpage of the conference is: