
Financial Distortions and Macroeconomic Performance: Expectations, Constraints and Interaction of Agents

Final Report Summary - FINMAP (Financial Distortions and Macroeconomic Performance: Expectations, Constraints and Interaction of Agents)

Executive Summary:
Motivated by the 2008 financial crisis and rising doubts about the proper functioning of financial markets, FinMaP was launched to develop elements of a newly emerging paradigm for understanding the reasons for the apparent imperfections and dysfunctionality of financial markets, and to develop new analytical tools that help monetary policy define an appropriate strategy in response to these new challenges. The main impact of the project, therefore, lies in the development of decision support tools, new models and methodologies that are of potential use for the analysis of financial markets, their interaction with the real economy, the transmission mechanism of monetary policy and the effects of regulatory reforms. These methods and models are designed to be applicable by monetary authorities and regulators.
The main research questions of our project have been:
• The integration of binding credit constraints and liquidity freezes in macroeconomic models,
• The identification of speculative bubbles and development of pertinent early warning indicators,
• The exploration of the behaviour of agents under risk and uncertainty, and of the possibility of monetary policy to influence agents’ expectation formation to prevent ‘bad’ outcomes,
• The modelling of financial institutions, their interactions and how they interact with monetary authorities in the transmission of monetary policy.
Our research has used a broad portfolio of methodological approaches, some of which are traditionally used by central banks, and others which are relatively new but of high relevance as emerging decision support tools for monetary policy. Our methodological portfolio included:
• Dynamic stochastic general equilibrium (DSGE) models, which before the crisis had been the dominant tool used by central banks of developed and developing economies for policy analysis and macroeconomic forecasting. Our research has addressed the major limitations of these models that came to the fore with the financial crisis of 2007/8, namely the absence of financial markets, a banking sector and financial frictions with macroeconomic repercussions in the real sector of the economy.
• Agent-based models (ABMs), which have been promoted as a promising alternative avenue in the face of the methodological limitations of the DSGE apparatus. We have developed ABMs for particular segments of the financial sector, as well as a comprehensive model of the banking sector and its interaction with the real sphere. We have also started to use these models as a new computational platform to conduct counterfactual policy analyses.
• Network models of the financial sector. This new class of models has gained enormous attention as a tool to investigate questions of systemic risk due to high connectivity within the financial sector, possible chains of contagious defaults, and macroprudential policy to make the financial sector more resilient. Since 2008, this has become an important decision support tool for regulators.
• Experimental research on the formation of expectations in financial markets and on the problem of interaction between private and public information.
The main objective of this strand of our research was to help monetary authorities fine-tune their communication strategy and to help evaluate the work of rating agencies and related institutions issuing publicly available information signals that market participants interpret in the light of their private information.

Project Context and Objectives:
The first pillar of our research project has been the analysis of individuals’ expectation formation under uncertainty and bounded rationality, as well as their choice of strategy when operating in financial markets in which they are confronted with both exogenous and endogenous (uncertainty about the reactions of other market participants) sources of risk. Our research on this topic has been based on laboratory experiments as well as empirical research using survey data. Our laboratory experiments have mainly been devoted to the important question of the interaction between private and public information. In real-life settings, the latter includes, on the one hand, ubiquitous information on ratings of economic entities (companies, sovereigns), but, on the other hand, also targeted dissemination of public information by policy actors such as central banks. We were mainly interested in how far the availability of public information could come at the cost of reduced private activity of information acquisition. We indeed found in our laboratory experiments that the availability of more public information tends to reduce agents’ efforts to collect independent information. Publicly available information might thus act as a coordinating device for agents’ expectations. However, given that public signals need not necessarily be correct (e.g. ratings of CDOs based on subprime mortgages during the build-up of the financial crisis), the neglect of private information acquisition could actually prove detrimental to the efficiency of the price formation process.
Expectation formation processes have also been investigated in important field data, i.e. the German ZEW Indicator of Economic Sentiment and the Livingston survey of US stock market sentiment. The empirical analysis has revealed important asymmetries and nonlinearities in agents’ reaction to recent information. In particular, it has been found in both data sets that a short streak of negatively connoted information events triggers expectations of mean reversion towards “business as usual”, while agents tend to become extremely pessimistic after a longer streak of negative news. This suggests the existence of a potentially vicious feedback in which a moderate downturn could develop into a lasting scenario of a self-fulfilling depressed state of beliefs among consumers, producers and investors.
To study the emergence and breakdown of speculative bubbles, a reduced-form model of speculative interaction in the form of a mathematical catastrophe model has been estimated, which is able to describe dynamic systems subject to discontinuous, dramatic changes of behavior. Applying this methodology to US stock market data, we found that large changes during the 1980s (e.g. the stock market crash of October 1987) could be identified as bifurcations triggered by endogenous market dynamics, while the more recent phases of prolonged downturns appear to have been triggered by exogenous forces.
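The bifurcation logic behind such a catastrophe model can be made concrete with the canonical cusp. The sketch below is an illustrative fragment under our own conventions (function names and the canonical parametrization are not taken from the estimated model): it computes the equilibria of the cusp potential V(x) = x⁴/4 − b·x²/2 − a·x and checks whether the control parameters lie inside the bifurcation set, where two stable market states coexist and a discontinuous jump such as a crash becomes possible.

```python
import numpy as np

def cusp_equilibria(a, b):
    """Real equilibria of the canonical cusp potential
    V(x) = x^4/4 - b*x^2/2 - a*x, i.e. real roots of x^3 - b*x - a = 0.
    a: asymmetry (normal) factor, b: bifurcation (splitting) factor."""
    roots = np.roots([1.0, 0.0, -b, -a])
    return sorted(r.real for r in roots if abs(r.imag) < 1e-9)

def inside_bifurcation_set(a, b):
    """The fold set of the cusp is 27*a^2 < 4*b^3: inside it the system
    has two stable equilibria (plus one unstable), so small parameter
    changes can trigger a discontinuous jump between market states."""
    return 27.0 * a ** 2 < 4.0 * b ** 3
```

Outside the fold set (e.g. a = 0, b = −1) there is a single equilibrium; inside it (e.g. a = 0, b = 1) there are three, the hallmark of a regime where endogenous dynamics alone can produce a crash.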
Empirical analysis on the basis of micro data on credit links between banks showed that this market is very much governed by lasting relationships that are often honored by granting preferential interest rates to counterparties with whom a long mutual relationship exists. This feature, however, also makes the market vulnerable to structural disruptions upon a loss of trust in the liquidity of some banks, which might lead to widespread withdrawal from the market. Indeed, it appears that particularly the largest banks started to hoard liquidity when the financial crisis broke out, thus cutting off many smaller counterparties from the possibility of balancing liquidity shortages. The mechanics and behavior within the interbank market have been modelled using an agent-based approach. These models allow studying contagion effects after single bankruptcies and their dependency on behavioral regularities at the micro level and emerging macro patterns of the market as a whole. These studies confirm, at a more detailed level, the finding of the previous literature that the density of the network is related in a non-monotonic way to its fragility or robustness: While more connections help to share risks, they also constitute channels for the propagation of stress throughout the system.
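The risk-sharing versus contagion-channel trade-off can be illustrated with a deliberately minimal toy cascade (our own sketch, not the project's calibrated agent-based model; all parameter names and values are illustrative). Each bank lends a fixed total of 1.0, split equally among its debtors, so a denser network means smaller individual exposures but more channels along which a default can propagate:

```python
import random

def contagion_cascade(n_banks, n_debtors, capital, seed=0):
    """Toy interbank network: every bank lends a total of 1.0, split
    equally among `n_debtors` randomly chosen counterparties. A bank
    defaults once its credit losses exceed `capital`; bank 0 fails
    exogenously. Returns the total number of defaults."""
    rng = random.Random(seed)
    debtors = [rng.sample([j for j in range(n_banks) if j != i], n_debtors)
               for i in range(n_banks)]
    exposure = 1.0 / n_debtors  # denser network -> smaller exposures
    losses = [0.0] * n_banks
    defaulted = {0}
    frontier = [0]
    while frontier:
        newly_defaulted = []
        for d in frontier:
            for i in range(n_banks):
                if i not in defaulted and d in debtors[i]:
                    losses[i] += exposure
                    if losses[i] > capital:
                        defaulted.add(i)
                        newly_defaulted.append(i)
        frontier = newly_defaulted
    return len(defaulted)
```

With buffers of, say, capital = 0.3, a dense network (ten debtors per bank, exposure 0.1 each) absorbs the initial failure without any knock-on default, while a sparse one (one debtor per bank, exposure 1.0) lets every creditor of the failed bank fail in turn, which is the intuition behind the non-monotonic density-fragility relationship described above.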
Both the interbank and the stock market also constitute building blocks of a more comprehensive model that captures additional layers of the financial side (the primary and secondary market for bonds) and the real sector. In this multi-market agent-based framework, we find that a number of empirical stylized facts emerge endogenously from the interaction of our agents, confirming the validity of the model as a potential replica of the central components of financial-real interaction. We have used this model to study the effect of leverage on the frequency of systemic events and how increased losses on loans can lead to a detrimental downward spiral of liquidity via fire sales of assets by illiquid banks. Examining the effects of “unconventional” monetary policy within this framework, we find that the central bank can mitigate this vicious feedback via access to improved refinancing facilities.
The links between the financial and the real sector have also been analyzed from different perspectives. We examined the network spanned by loans extended by banks to commercial firms. We developed a model of link formation so as to capture its basic stylized facts. The reaction of this network to shocks shows pronounced nonlinear behavior: While the default of a single unit mostly does not cause any significant knock-on effects, in some cases a full-scale collapse arises. Identification of the “dangerous” units proves difficult, as it is their exact position in the network, rather than firm-specific characteristics, that determines their damaging potential.
Besides these agent-based approaches, we have also studied the link between the financial and the real sector within more traditional Dynamic Stochastic General Equilibrium (DSGE) models. Here our main goal has been to integrate portfolio choice into the DSGE apparatus. Our new models in this vein have been applied to study monetary and macroprudential policy in the light of the linkages between current account deficits and the financial vulnerability of a country. We find that external shocks are important in driving current account deficits, and that monetary policy can improve welfare in such a context by allowing for an interest-rate response to financial variables. A model with heterogeneous investors shows that collateral constraints lead to an amplification of business cycles. Another application concerns macroprudential policies to curb excessive household debt. It is found that a countercyclical imposition of loan-to-value ratios would improve welfare for all agents in the economy.
While all variants of ABM and DSGE models have been designed to shed light on the effects of monetary and macroprudential policy from various angles, we have also particularly scrutinized the very particular situation policy makers are facing in the Eurozone. On the empirical side, we have conducted a detailed analysis of the potential role of animal spirits in the large spreads of the Eurozone sovereign bond markets. Our analysis indicates that the widening of spreads during the European debt crisis had its origin partly in deteriorating fundamentals of some countries, but to a non-negligible extent it was also the result of collective movements of panic and fear. This finding supports the necessity for the ECB to intervene in the sovereign bond market to curb a speculative bubble with potentially strong negative effects for the real economy. On the theoretical side we have designed an ABM with financial frictions on the supply side to cover important characteristics of the Eurozone dynamics, in which we analyze the effect of different strategies of monetary policy as well as fiscal austerity. The model is capable of explaining booming production during phases of optimism, but also shows that markets are fragile when market sentiment reverses.
In line with several empirical studies for European countries, our model shows that a weak and anemic supply side can be the key driver of a recession. It works as the propagator of shocks originating elsewhere, e.g. the subprime crisis. Further, we have analyzed potential shortcomings in the design of the Eurozone that have led policymakers to adopt problematic macroeconomic policies given the economic situation in the periphery countries since the eruption of the sovereign debt crisis. We argued that a budgetary union can provide the necessary framework to deal with the still unsynchronised business cycles of the Eurozone countries. By centralising part of the national budgets into a common budget managed by a common political authority, the various increases in budget deficits following from a (common) recession would translate into a budget deficit at the union level. As a result, the destabilising flows of liquidity between countries during recessions should disappear, and the common budgetary authority can allow the automatic stabiliser in the common budget to perform its role in smoothing the business cycle.
Using a variety of empirical and theoretical approaches the project results provide a comprehensive analysis of the driving forces leading to financial stress, overreactions and decoupling of financial markets from real activity and the danger of systemic breakdowns of important parts of the financial system. It provides evidence on how these deficiencies and looming critical changes can be efficiently addressed by monetary and macroprudential policy within a variety of alternative theoretical frameworks. The insights from the project should constitute valuable input for monetary policy in the Eurozone and beyond.

Project Results:
WP 1: Expectation Formation and Behavior under Uncertainty
The main objective of this work package has been the analysis of the information aggregation process in markets as a function of the investors’ access to a spectrum of heterogeneous sources of information. We focus on the influence that public institutions (such as central banks, rating agencies, the IMF or European Commission) have on the investors’ demand for information and their expectation formation mechanism. In particular, we have conducted experimental research on laboratory financial markets with the simultaneous existence of public and private sources of information. Such research aims at designing better strategies for policy institutions when communicating with the “markets” and the general public. We consider that announcements of public institutions may generate a potential trade-off between an increase in market liquidity due to the market entry of a larger number of investors (whose information set relies mainly on the public announcement) and the potential distortive effects of public announcements on the aggregation of information into prices.
1.1 Experimental research on investors’ expectation formation
Laboratory experiments have been shown to be a very useful method to test predictions derived from theoretical models and to identify situations in which information will or will not be aggregated into prices. In the theoretical literature we find little agreement on whether the release of public information is beneficial or detrimental for market performance or social welfare. Therefore, we have conducted a set of experimental studies of the information aggregation process in a market as a function of the access to different sources of information, namely an imperfect, but costless public signal in a market where the participants also have access to costly and imperfect private information. We observe that the release of public information provokes a crowding-out effect on the traders' demand for private information, while the overall level of information in the market remains constant. Therefore, we would expect market prices to behave similarly with and without public information since the reduction in private information is compensated by the public information released. However, in those markets where public information is released, the market’s ability to correctly aggregate all information is significantly reduced.
Our main hypothesis is that traders’ overweighting of public information (in their expectation formation mechanism) is the main factor responsible for the reduction in price efficiency observed in our experimental markets. In fact, our study is the first to identify and quantify such an overweighting phenomenon. Our experiments provide empirical support for this conjecture and show that this might be an important effect to consider when dealing with private and public information. When evaluating the effect of different relative degrees of precision of public and private information, we observe that overweighting of public information increases with a higher relative precision of public information (with respect to private), whereas when the private information available to traders is more precise than the public information, overweighting is less pronounced. Our main conclusion is that the presence of public information distorts both the informativeness of market prices and, as a consequence, market performance. Such an effect can lead financial markets towards a situation where the price becomes disconnected from fundamentals.
1.2 Design of information policy to stabilize markets
Expanding on the results of the previous experiments, we have defined two main objectives for the subsequent step within this work package: on the one hand, testing the impact of increasing the number of independent public institutions releasing information and, on the other hand, scrutinizing, using laboratory experiments, the relevance of different features of public information, such as the degree of ambiguity, transparency or precision. Increasing the number of independent public institutions releasing information is an issue within the European Union conceptual framework to reduce reliance on external credit ratings. Among the measures proposed in the current debates on regulatory policy, one important issue is the development of mechanisms to improve the conditions for effective competition on the concentrated market for credit rating agencies (CRA). In fact, the CRA III regulation includes a set of measures to increase competitiveness and to support the growth of smaller CRAs within the rating market. The main focus of these measures is to strengthen the investors’ own credit risk assessment and reduce their sole reliance on credit ratings.
We have implemented an experimental market with endogenous acquisition of private information and compared two different settings: In the first setting, public information is released as a single signal as in Section 1.1. In the second setting, two independent public signals with the same precision are released. The results of this research show that if a given amount of public information is released in the form of two independent public signals with lower precision (instead of a single one with higher precision), traders’ effort to gather private information increases significantly, thereby also enhancing market informativeness. An increase in market informativeness does not translate, however, into higher market efficiency when the two signals contradict each other in their prediction of the fundamentals, because in this case the aggregate public information does not provide any new information to the market.
To obtain a better understanding of the mechanism responsible for traders’ overweighting of public information, we studied traders’ strategies in the presence of public information and the role of traders with different information in the coordination of expectations on the public announcement. In particular, we were interested in the influence that uninformed traders have on market liquidity and stability in scenarios with public and private information compared with those with only private information. We found that correct public information greatly helps the aggregation of private information into prices by coordinating traders’ actions towards the true state of the world. When it is incorrect, though, it distorts the aggregation of private information into prices, coordinating traders’ actions towards the wrong state of the world. This “wrong” informational cascade is more likely to be observed the higher the number of uninformed traders in the market.
Our main conclusion is that, under some circumstances, the release of public information can distort the aggregation of information into prices. As policy advice, we recommend that future reforms of the regulation of financial institutions (for instance, credit rating agencies) should account for the complex interplay between private and public information that we have identified in our experiments and give investors incentives to search for alternative sources of information.
1.3 Experimental research on investors’ reaction to qualitative information
Traders in financial and credit markets are continuously exposed to a variety of announcements and statements coming from different sources: heads of central banks, funds or credit rating agencies, politicians, etc. Such public statements are by nature not sharp or quantitative but rather qualitative or fuzzy. It is, therefore, relevant to develop realistic behavioral rules for traders’ expectation formation, to better understand the impact of such announcements via the mechanism agents use to translate such a heterogeneous set of information into an investment strategy. In order to study how announcements affect investors’ behavior in the market, we should first be able to observe how they form their expectations over the whole time spectrum, meaning short- and long-run expectations. The main objective of our research in Task 3 has been to develop an effective tool to elicit short- and long-run expectations and their changes.
In order to study the evolution of the whole spectrum of expectations, we use controlled laboratory experiments, which have the advantage of a perfect monitoring of the information available to the agents at any period of time. In particular, we have conducted a Learning to Forecast Experiment (LtFE, hereafter), following Marimon and Sunder (1993). In a standard LtFE subjects submit (one step-ahead) predictions about future values of economic variables, typically prices, which are endogenously determined as a function of those predictions. Subjects are rewarded based on their forecast accuracy. The empirical evidence of LtFEs with strong positive feedback between expectations and realized market prices shows a persistent coordination of short-run predictions, although not always on the Rational Expectations Equilibrium (REE, hereafter).
In our research we incorporate into the LtFEs the elicitation of long-run expectations in order to study the dynamics of the whole term structure of expectations. Such modification allows us to have a more comprehensive understanding of the agents’ expectations formation mechanism.
In the following we will briefly describe our novel experimental setting: In each session 6 subjects submit their predictions for the price of an asset for 20 periods. At the beginning of period t, each subject submits her short-run prediction for the asset price at the end of period t, as well as her set of long-run predictions for the price at the end of each of the 20−t remaining periods. When submitting their predictions subjects are not informed about the predictions submitted by the other subjects and have just qualitative information on the price generating mechanism: there is a positive relationship between their one-step ahead predictions and the next realized price.
To keep our setting simple, we explicitly exclude long-run expectations from the price generating mechanism. We expect, therefore, the elicitation of subjects’ long-run expectations not to have a significant impact on the price dynamics. On the contrary, we do expect an influence of price dynamics on the formation and evolution of subjects’ long-run expectations.
Our results show that subjects’ expectations are not consistent with the REE, neither in the short nor in the long run. Subjects’ expectations, instead, can be described using an adaptive learning scheme. Interestingly, subjects’ predictions are centered on the last price, independently of the horizon of their predictions. The last realized price constitutes, then, a time-varying anchor or, using the terminology of Morris and Shin (2002), a focal point for subjects’ expectations coordination. Moreover, when revising their expectations, subjects consider their past prediction errors computed within a time window proportional to their prediction horizon. Notably, we observe a significant asymmetry in the speed of adjustment, which ultimately is responsible for the lower level of consensus of subjects’ long-run expectations. This asymmetric adjustment could be a consequence of the fact that subjects experience immediate feedback on the accuracy of their short-run predictions, while there is a delay when evaluating the accuracy of their long-run predictions.
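A stylized version of this adaptive revision rule can be written down directly. The fragment below is a sketch under our own assumptions (the function name and the gain parameter are illustrative, not estimates from the experiment): the last realized price serves as the anchor, and the revision adds a fraction of the mean prediction error over a backward window proportional to the forecast horizon.

```python
def revise_forecast(prices, prev_forecasts, horizon, gain=0.5):
    """Anchor-and-adjust rule: start from the last realized price and
    correct by `gain` times the mean past prediction error, computed over
    a window whose length is proportional to the forecast horizon.
    `prices` and `prev_forecasts` are aligned histories of realized
    prices and the forecasts previously made for them."""
    window = min(horizon, len(prices))
    errors = [p - f for p, f in zip(prices[-window:], prev_forecasts[-window:])]
    return prices[-1] + gain * sum(errors) / window
```

With a history of perfect forecasts the rule simply returns the anchor (the last price); after systematic under-prediction it revises the forecast upward, and longer horizons average errors over longer windows, consistent with the slower adjustment of long-run expectations observed in the experiment.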
We believe that our novel experimental setting is a valuable tool to study the whole time spectrum of subjects’ expectations and test different hypotheses on expectations formation. Future research will be devoted to conduct experiments where new information at the disposal of subjects is not limited to the time series of prices, but includes other information sources, such as aggregate information on subjects’ long-run expectations, public announcements of policy measures or future changes of the fundamentals.

1.4 ABM modeling based on experimental findings
In this task we have complemented our experimental research by formulating a behavioral model to explain our experimental results, reproducing the main features of our experimental setting and conducting computer simulations to compare the results with our experimental findings. Controlled laboratory experiments and computer simulations have in common the possibility to have, at each instant of time, a complete monitoring of the information possessed by each trader and, at the same time, a detailed record of his or her trading activity.
We have designed a simple theoretical model of an artificial financial market based on the experimental design to explain the coordination of prices around the public signal released by a regulatory institution. Traders in this market can trade risky assets based on costless noisy information about the asset value. The artificial market is populated by boundedly rational traders with heterogeneous cognitive capabilities: (i) naive traders, who decide about their trading strategies based on their own information set only, disregarding the trading motives of the other agents, and (ii) sophisticated traders, who take into account the entire distribution of information across traders, acting strategically. In this framework, the release of public announcements generates, through the complex market interactions, a coordination of market activity around the public information released.
As expected, public information becomes a focal point when it is released, and this outcome is strengthened even when the proportion of incorrectly informed subjects increases. Hence, heterogeneity combined with bounded rationality and risk neutrality generates theoretical results in our model that are in line with our previous experimental findings. We find a clear coordination effect around a correct or incorrect public signal even with a low proportion of sophisticated traders in the market (i.e. 20% sophisticated traders).
Our model, combined with simulations, therefore provides a deeper understanding of the impact of public information on traders’ strategies and market prices. In order to study the relevance of finite-size effects in our theoretical model, we have run Monte Carlo simulations with a finite number of agents, N=15, as in the experimental design described before. Fixing the proportion of sophisticated traders at 20%, we observe that our theoretical model is even in good quantitative agreement with the experimental results when we consider fluctuations in the configuration of the investors’ population regarding their private information.
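The coordination mechanism can be illustrated with a deliberately stripped-down sketch (our own simplification, not the paper's actual model: the fixed weights and the mean-valuation "price" rule are illustrative assumptions). Naive traders overweight the public signal, sophisticated traders discount it, and the resulting price tilts toward the public announcement as the naive share grows:

```python
import random

def market_price(public_signal, true_value, n_traders=15,
                 frac_sophisticated=0.2, noise=1.0, seed=0):
    """Naive traders put a heavy weight on the public signal in their
    valuation; sophisticated traders, who account for the distribution
    of information across traders, discount it. The 'price' here is
    simply the mean of individual valuations (a simplification)."""
    rng = random.Random(seed)
    n_soph = round(frac_sophisticated * n_traders)
    valuations = []
    for i in range(n_traders):
        private = true_value + rng.gauss(0.0, noise)
        w_public = 0.2 if i < n_soph else 0.8  # naive: overweight public info
        valuations.append(w_public * public_signal + (1.0 - w_public) * private)
    return sum(valuations) / n_traders
```

Setting `noise=0.0` makes the outcome deterministic: with 20% sophisticated traders the price lands between the fundamental and the public signal, and removing the sophisticated traders pulls it still closer to the public signal, mirroring the coordination effect found in the experiments.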
We, thus, find that the model provides a good foundation for the analysis of the effects of the release of public information in financial markets. Our experimental setting, supported by the theoretical model and computer simulations, can therefore be used as a realistic testbed in order to assess the performance of different communication strategies to smooth the adverse effects of the release of public information, while preserving its positive effects.
WP 2: Identification and Modelling of Speculative Bubbles
The second work package has been concerned with empirical research attempting to uncover the behavioral regularities of financial bubbles and to identify early-warning signals of speculative bubbles and other forms of market failure. The building up and bursting of (speculative) price bubbles can be seen as a stylized fact of financial markets. Given the recent experience, it appears obvious that mispricing of financial assets on a global scale can trigger financial crises. The possible reasons for such phenomena, which are heavily debated in the literature, include excess liquidity due to expansive monetary policy or psychological explanations such as herding or moral hazard. Another ongoing debate to which we contributed is whether speculative bubbles can be identified. For this analysis, we used both raw financial data from various markets as well as survey data from different sources that provide intuition on market participants' expectations. We have estimated regime-switching models, which are natural candidates for identifying the shift to a crisis period and the accompanying build-up and bursting of a speculative bubble. Alternative approaches, including frequency-domain and wavelet-based techniques, for the identification of structural changes in the behavior of a time series have been examined as well.
2.1 Determination of commonalities among financial markets
As a preliminary step towards behavioral modelling, the work conducted in this part of the project consisted of data-analytical and econometric analyses of a variety of asset classes and of the co-movements of returns and volatility between them. We used advanced econometric methods to study time variation in parameters, volatility clustering, heteroscedasticity, non-stationarities, long-term memory, regime switching and other characteristics which are crucial factors of price formation in financial markets. Although we have also examined standard asset classes such as stocks, stock indices, commodities and foreign exchange rates, we focus here on our work on credit default swaps (CDS) due to space constraints. The co-movement between CDS and other asset classes has become a topic of interest because of the role of CDS in the propagation of distress during financial crises.
CDS are contracts providing protection to the buyer of the underlying asset in the case of its default. The market for CDS is huge with a notional outstanding volume at the level of trillions of US dollars. This high liquidity renders the CDS market appealing to study the determinants of credit risks, and the co-movements with other asset markets. We have investigated the impact of a changing market environment on the development of CDS spreads written on debt from Euro Stoxx 50 firms. A Panel Smooth Transition Regression reveals that the relevance of different fundamental factors on the spreads of these CDS is strongly time-varying. As relevant fundamental factors we identify the ECB’s systemic stress composite index, the Sentix index for current and future economic sentiment, and the VStoxx, an implied volatility index based on standard options for the underlying Euro Stoxx 50 index. These variables drive the market’s transition between different regimes thereby reflecting the impact of substantial swings in agents’ risk perception on CDS spreads.
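The mechanics of such a Panel Smooth Transition Regression can be sketched in a minimal one-regressor form (an illustration under our own naming conventions, not the estimated specification): the coefficient on a fundamental factor moves smoothly between two regimes as a transition variable, such as a systemic stress index, crosses a threshold.

```python
import math

def logistic_transition(s, gamma, c):
    """PSTR transition function g(s; gamma, c) in (0, 1): `s` is the
    transition variable (e.g. a stress index), `gamma` governs the
    smoothness of the regime change and `c` its threshold location."""
    return 1.0 / (1.0 + math.exp(-gamma * (s - c)))

def pstr_fitted(x, s, beta_low, beta_shift, gamma, c):
    """Fitted value of a one-regressor PSTR: the coefficient on x moves
    smoothly from beta_low (calm regime) to beta_low + beta_shift
    (stressed regime) as the transition variable s rises through c."""
    return (beta_low + beta_shift * logistic_transition(s, gamma, c)) * x
```

At s = c the transition function equals 0.5, so the effective coefficient lies halfway between the two regimes; a large `gamma` makes the model approach a sharp threshold regression, while a small `gamma` yields a gradual regime shift, which is exactly the time-varying relevance of fundamentals described above.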
We also find that CDS spreads show different reactions at different times to leverage ratios of firms. In the period between 2004 and 2007 investors' “risk appetite” appeared to have been exceptionally high leading to high leverage ratios and low CDS spreads. During crisis periods, in contrast, higher leverage appears to be accompanied by higher CDS spreads in line with the expectation that increasing leverage ratios are connected with severe liquidity problems. We have also studied the dependence or co-movement between the CDS and the equity market. An increase in equity volatility coincides with an increase in firms' CDS spreads, especially during the crisis periods. Overall, the results demonstrate the varying importance of different explanatory factors in the pricing of risk derivatives during tranquil and turbulent times, and in particular, the changing role of leverage as a factor increasing both expected profits and risk.

2.2 The role of survey data in defining and measuring bubbles
The recent financial crisis has also cast doubts on the usefulness of standard asset pricing models in explaining large, persistent and recurrent run-ups in financial markets, especially in equity markets. As a consequence, researchers have increasingly felt the need to take into account the possibility that potentially biased beliefs could drive the decisions of market participants. To shed light on this issue and determine the structural properties of agents’ belief formation, we have used economic survey data.
Evidence from experiments and empirical studies has long nourished doubts on the validity of the rational expectation hypothesis (REH). A voluminous body of theoretical work incorporates various forms of deviations from the REH. One promising approach that we have pursued is a “quasi-Bayesian” framework of expectation formation along the lines of the seminal contributions by Barberis, Shleifer, and Vishny (1998) (hereafter BSV) and Rabin (2002). In the BSV model, a decision maker forecasts the earnings of an asset by using one of two possible heuristics, trend continuation or trend reversals. In case earnings follow, for instance, a random walk process, the decision maker would incorrectly interpret these random changes as a stochastic process switching between a reversal and a continuation regime. The BSV model thus implies that the predicted probability of a positive (negative) outcome in the next period is increasing in the number of consecutively observed positive (negative) outcomes in the past. In Rabin (2002), decision makers approach their forecasting task with the prior belief that outcomes are generated by repeated draws from an “urn” with unknown properties. The decision maker then needs to infer the distribution of the different “balls” for the possible direction of the process in the urn. This Bayesian heuristic can generate two well-known behavioral “anomalies”. The first is the “gambler's fallacy” under which decision makers wrongly believe that early draws of a positive signal increase the probability of negative signals in the short run. The second is the “hot-hand effect” which means that after observing a streak of the same outcome, decision makers change their expectations of urn frequencies toward the more frequently observed outcome and erroneously expect that the streak will continue. These two effects result in an overestimation of the probability of a short streak reversing and that of a long streak continuing.
Both models differ qualitatively in their predictions for short streak lengths. In the BSV model, the belief in trend continuation increases monotonically in streak length. In the Rabin model, the decision maker forecasts a reversal for short streaks and trend continuation for longer streaks.
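The qualitative difference between the two heuristics can be made concrete with a stylized sketch; the functional forms and all parameter values below are illustrative assumptions, not the fitted models, and serve only to reproduce the qualitative predictions described above.

```python
def bsv_continuation(streak):
    """Stylized BSV heuristic: the perceived probability that a streak
    continues rises monotonically with streak length (illustrative
    parameterization, not the fitted model)."""
    return 0.5 + 0.4 * (1.0 - 0.7 ** streak)

def rabin_continuation(streak, switch=3):
    """Stylized Rabin heuristic: gambler's fallacy (reversal expected)
    for short streaks, hot-hand continuation once the streak length
    exceeds `switch` (again purely illustrative)."""
    if streak <= switch:
        return 0.5 - 0.1 * streak          # short streak: expect reversal
    return 0.5 + 0.1 * (streak - switch)   # long streak: expect continuation
```

Plotting both functions over streak lengths 1 to 8 reproduces the key contrast: BSV-type beliefs always lean toward continuation and strengthen with streak length, while Rabin-type beliefs dip below one half for short streaks before switching to continuation.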
To investigate the effect of qualitative statements on expectation formation, we have fitted the two behavioral models of BSV and Rabin to the forecasting data of the Economic Sentiment Survey of the Centre for European Economic Research, Mannheim. This field data exhibits certain behavioral patterns. Given a streak of past negative surprises, forecasters appear to expect a reversal after a few negative signals, but become pessimistic once a long streak of negative signals occurs. On the other hand, forecasters become slightly optimistic for a streak of positive surprises, regardless of its length. Estimating the BSV and Rabin models shows that the latter provides a better performance for negative streaks while BSV appears to provide a better explanation for agents’ behavior after positive streaks.
In a closely related exercise, we have used a Panel Smooth Transition Regression model to empirically study nonlinearities in the expectation formation process in the US stock market. The underlying data stems from the Livingston survey and we pursued the question of how the relative importance of regressive and extrapolative expectations varies over time as market conditions summarized by stock market misalignments and recent returns change. It is found that large misalignments of stock markets coincide with a strong expectation of mean reversion while small misalignments are expected to diminish slowly. Therefore, survey participants appear to form stabilizing expectations in the long run. Short-run expectations, in contrast, are consistent with only relatively weak mean reversion of stock prices.
2.3 Time-Frequency methods for understanding heterogeneity of economic agents
This part of our empirical analysis has been concerned with the identification of economic relationships using time-frequency methods. The empirical behavior of economic and financial variables is often changing across different frequencies due to, for instance, seasonality or business cycle dynamics. In this part of the project, we have both used existing methodologies but also have developed new methods to disentangle the relationships between important economic and financial variables across various frequencies.
First, we have developed a new technique to study the determinants of the volatility dynamics of financial assets using wavelets. Wavelets are a convenient and efficient way of representing complex variables as they can disentangle the determinants of a time series into different frequency components (which might emerge due to heterogeneous time horizons of economic actors). They are especially useful where the influence of a variable lasts only for a finite time, or shows markedly different behavior in different time periods. Within this framework, we have used a new measure of implied volatility on S&P 500 and DAX options with monthly maturities. It was found that this new measure of implied volatility provides an unbiased forecast of realized volatility at the long-term horizon of one month. This result confirms that measurement of volatility implied by option prices is important for volatility forecasts. However, it shows that the details of selecting the relevant set of options do matter for the quality of forecasts and that current implied volatility measures can be improved by a judicious selection of the relevant options.
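The basic frequency-separation idea behind the wavelet approach can be illustrated with a single level of the simplest wavelet, the Haar transform; this minimal sketch is not the methodology developed in the project, merely an illustration of how a series splits into low- and high-frequency components.

```python
import math

def haar_step(x):
    """One level of the orthonormal Haar wavelet transform: split a
    series of even length into a low-frequency approximation (pairwise
    scaled sums) and a high-frequency detail component (pairwise scaled
    differences)."""
    s = math.sqrt(2.0)
    approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return approx, detail

# A slow trend plus a fast alternating component: the detail
# coefficients isolate the high-frequency alternation while the
# approximation carries the trend.
series = [t + (1 if t % 2 else -1) for t in range(8)]
approx, detail = haar_step(series)
```

Because the transform is orthonormal, the total variance (sum of squares) of the series is preserved across the two components, which is what allows attributing volatility to distinct frequency bands.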
Secondly, we have used time-frequency methods to decompose the relationship between financial and macroeconomic data into components at different frequencies. To this end, we used a general framework for measuring frequency dynamics of connectedness in economic variables based on the spectral representation of variance decompositions. Such an approach provides new insights into the time-varying comovements of variables as shocks with heterogeneous frequency responses will create frequency dependent connections of different strengths that would remain hidden when only time domain measures were used. We document strong short-term dynamics of connectedness of the volatility of major US stocks in our first application. We conclude that their dynamics are mainly driven by frequencies from one day up to one month, although this does not hold in periods of turmoil with high levels of uncertainty. Economically, periods with a high degree of connectedness at high frequencies are periods when stock markets seem to process information rapidly. In case the connections come from the lower frequencies, shocks will be transmitted over longer horizons. This behavior may be attributed to fundamental changes in investors' expectations. Turning our attention to macroeconomic data, our measures demonstrate the importance of a proper assessment of common stochastic trends. An application of the methodology to industrial production of G7 countries reveals that the shock transmission from one country to another depends on the state of the business cycles. Thus, different degrees of connectedness are due to different countries being in different phases of their business cycles when a particular shock hits.
2.4 Identification and early warning indicators of speculative bubbles
Finding warning indicators for speculative bubbles is an obvious objective for bringing the insights of various econometric analyses of behavioral biases, deviations from fundamental valuation and correlations between assets to practical fruition. In this section we attempted to develop warning indicators for stock and credit markets which have been in the focus of interest during the last financial crisis.
To this end, we studied the dynamics of a stylized model of speculative dynamics within the analytical framework of a stochastic “catastrophe” model (in the mathematical sense of “catastrophe” indicating an abrupt change of the state of a system). In particular, we have developed a two-step estimation methodology that allows applying catastrophe theory to stock market returns with time-varying volatility. The methodology has been empirically tested on nearly 27 years of US stock market returns (1984-2010) covering several important recessions and crisis periods. While we find that the stock markets showed signs of bifurcations in the first half of the period, the model was not able to identify such tendencies in the second half. Interpreting the results, we conclude that the US stock market’s downturns were more likely to be driven by endogenous market forces during the first half of the period, while during the second half exogenous factors seemed to have been responsible for most of the large downward swings.
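The mechanics of a cusp catastrophe can be sketched numerically; the potential function and parameter values below are the textbook cusp form, not the estimated model, and serve only to show how the number of equilibria changes at a bifurcation.

```python
def cusp_equilibria(a, b, lo=-5.0, hi=5.0, n=20000):
    """Count equilibria of the cusp potential V(x) = x^4/4 - b*x^2/2 - a*x
    by locating sign changes (and exact zeros) of V'(x) = x^3 - b*x - a
    on a grid. Three equilibria (two stable, one unstable) mark the
    bimodal, bifurcated regime; one equilibrium marks the unimodal regime."""
    f = lambda x: x ** 3 - b * x - a
    xs = [lo + (hi - lo) * i / n for i in range(n + 1)]
    roots = 0
    for x0, x1 in zip(xs, xs[1:]):
        if f(x0) == 0 or f(x0) * f(x1) < 0:
            roots += 1
    return roots

# Moving the bifurcation parameter b from negative to positive pushes
# the system into the bistable regime, where small shocks can trigger
# abrupt jumps between equilibria (crash-like dynamics).
unimodal = cusp_equilibria(a=0.0, b=-1.0)
bistable = cusp_equilibria(a=0.0, b=1.0)
```

In the empirical two-step procedure described above, detecting "signs of bifurcations" amounts to testing whether the fitted control parameters place the market in the three-equilibria region of this surface.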
We have also developed a set of alternative early warning indicators for credit markets. Our approach is based on statistics characterizing the network structure of credit extended by banks to the non-financial corporate sector. Calibrating the model on a sample of firms and banks quoted on the Japanese stock exchange from 1980 to 2012, we found that both an increase of the volume of credit and of connectivity are positively correlated with the probability of a subsequent crisis and their combination represents an effective early warning measure. We further show that when systemic risk increases beyond a certain threshold, banks start to curb lending to more indebted firms which decreases output volatility without causing an overall credit contraction. Cutting loans to riskier firms permanently, however, reduces output volatility and, hence, systemic risk, but at the price of a lower average output level.
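The two network statistics underlying this early warning measure, aggregate credit volume and connectivity, can be computed from a bipartite bank-firm network in a few lines. The data structure and numbers below are hypothetical illustrations, not the Japanese data set.

```python
def credit_indicators(loans, n_banks, n_firms):
    """Early-warning statistics for a bank-firm credit network.
    `loans` maps (bank, firm) pairs to outstanding credit volume.
    Returns total credit volume and connectivity (the density of the
    bipartite network); jointly elevated values of both are the
    warning signal described in the text."""
    volume = sum(loans.values())
    connectivity = len(loans) / (n_banks * n_firms)
    return volume, connectivity

# Hypothetical snapshots: a credit boom (every bank lends to every
# firm, large volumes) versus a calm period (sparse, small loans).
boom = {(0, 0): 5.0, (0, 1): 4.0, (1, 0): 3.0, (1, 1): 2.0}
calm = {(0, 0): 1.0, (1, 1): 1.0}
v_boom, c_boom = credit_indicators(boom, n_banks=2, n_firms=2)
v_calm, c_calm = credit_indicators(calm, n_banks=2, n_firms=2)
```

In the calibrated model, a crisis warning fires when both statistics exceed thresholds estimated from past crisis episodes; either one alone produces many false alarms.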
The credit market is further studied by developing a model of heterogeneous agents to explain the dynamics of the European sovereign bonds market. Agents are assumed to make use of different information from the CDS market and the historical price movements of the sovereign bonds for their trading decisions. Depending on perceived risk, agents exhibit changing trading behaviors in high risk periods and tranquil times. We estimated our agent-based model with a smooth transition regression framework using German and Greek bond spread data. It turned out that crisis signals were indeed absent in the German bond market during the sample period, while the Greek one has been affected by multiple crises. In contrast to standard models, our results indicate that the Greek sovereign bond entered a risky period immediately after the bankruptcy of Lehman Brothers. Therefore, our model provides additional valuable information regarding the early prediction of crisis events.
WP 3: Estimation and Validation of Complex Macroeconomic Models: Methodological Development and Applications
In the aftermath of the financial crisis, agent-based models (ABM) received increasing interest as they constitute potential alternatives to “traditional” macroeconomic and finance models which were unable to explain or forecast the recent crisis. ABMs replace the concept of “representative agents” by agents that can display diverse, dynamic and interdependent behavior. The goal we have set ourselves for this work package was to advance the development of modeling tools and statistical/econometric approaches for the validation of such ABMs, and to compare ABMs with “traditional” macro/finance models. Rigorous parameter estimation of ABMs should allow us to assess their explanatory power and to select successful candidate models for particular purposes (modeling of investor behavior, interbank interactions, real-financial interaction) via their conformity with key empirical features. The identification of such models provides us with a better understanding of the financial crisis and provides important insights for future economic policy.
3.1 Methods for validation of agent-based models
Appropriately designed ABMs allow reproducing the main stylized facts of financial returns such as fat tails of returns and volatility clustering. While the details of models with realistic time series dynamics might differ, most of these models share a combination of centripetal and centrifugal forces that prevent asset prices from settling down at the benchmark fundamental equilibrium under rational expectations. Rather, the interaction of different trading strategies (chartism, fundamentalism, or more refined alternatives) leads to complex fluctuations of prices along with phases of over- and undervaluation, speculative bubbles and crashes. An important step in the development of this field of research is the empirical validation of such models based on high-frequency data from various types of financial markets. This section is devoted to this task. While a few attempts to estimate these models do exist, most of this recent literature lacks a systematic exploration of the performance of the proposed estimation algorithms. We attempted to close this particular gap in the literature by more systematically exploring the performance of various estimators.
We took the model of Alfarano et al. (2008) as a prototype ABM for the interaction of heterogeneous groups of investors in a typical financial market. The robustness of this model in generating the main stylized facts in a most elementary framework of interaction of an ensemble of agents makes this an attractive benchmark. Since much of the literature on heterogeneous agents with different trading strategies, bounded rationality or interactions among agents targets particular statistical features of financial returns, the use of moment-based estimators using exactly these features in their objective function seems tailor-made for bringing these models to the data. The basic idea is to analytically compute certain moments, or to generate simulated series from the economic model, and then match their moments with those computed from the data.
We consequently explore two methods of estimating this ABM, generalized method of moments (GMM) and simulated method of moments (SMM). The GMM method requires that a certain number of moment conditions can be analytically solved for the model. One then minimizes a certain norm of the sample averages of the moment conditions. In contrast to many related ABMs in the literature, Alfarano et al. (2008) have already derived a set of moments which enables us, after extending it to allow for changes of fundamental values, to estimate the model with analytical moments. The SMM approach is the simulated counterpart of the GMM procedure and is still applicable if the theoretical moments cannot be computed analytically. Comparison of both approaches revealed a much lower efficiency of SMM relative to GMM than expected on theoretical grounds. Its estimation errors remain much higher than those of the GMM estimates even with extremely large simulated samples. It appears that this feature is due to the limited range of moments available in univariate asset pricing models. As a consequence, the relatively low efficiency of the SMM estimator in our framework could carry over to many related agent-based models of financial markets as well as to similar diffusion processes in mathematical finance that all would have to be estimated on the base of the same limited set of moments.
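The core of the SMM procedure can be sketched with a deliberately simplified toy model: here i.i.d. Gaussian returns with unknown standard deviation stand in for the agent-based model, and the targeted moments are the variance and the mean absolute return. The seed, grid, and moment set are all illustrative assumptions; the actual estimation uses the Alfarano et al. (2008) model and a richer moment vector including autocorrelations of absolute returns.

```python
import random
import statistics

def sample_moments(x):
    """Two of the kinds of moments targeted by moment-based estimators:
    variance and mean absolute return (a full application adds
    autocovariances of squared/absolute returns to capture clustering)."""
    return statistics.pvariance(x), sum(abs(v) for v in x) / len(x)

def smm_objective(theta, data_moments, n_sim, seed=1):
    """Squared distance between data moments and moments of a series
    simulated under parameter theta. Re-seeding on every call fixes the
    simulation noise ('common random numbers'), so the objective is a
    smooth function of theta."""
    rng = random.Random(seed)
    sim = [rng.gauss(0.0, theta) for _ in range(n_sim)]
    return sum((a - b) ** 2 for a, b in zip(sample_moments(sim), data_moments))

rng = random.Random(42)
data = [rng.gauss(0.0, 2.0) for _ in range(4000)]     # "observed" returns
target = sample_moments(data)
grid = [0.5 + 0.1 * k for k in range(40)]             # candidate parameters
theta_hat = min(grid, key=lambda t: smm_objective(t, target, n_sim=20000))
```

A GMM estimator replaces the simulation step with analytically computed theoretical moments as functions of theta, which removes the simulation noise entirely; the efficiency loss of SMM reported above stems precisely from the residual variance that simulation adds to each moment.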
3.2 Empirical validation of selected stock market models with interacting agents
After having studied the efficiency of SMM and GMM methods using simulated data, we applied both estimators to real data. This is a further test for the methodology developed in the previous section and allowed us to examine whether the estimated parameters are similar for different financial markets. Firstly, we applied the SMM estimation method to various high-frequency financial data such as stock market indices, exchange rates and the price of gold. The data sets all consist of daily data from 1980 to 2010. For almost all of these financial markets the Alfarano et al. (2008) ABM cannot be rejected as being the “true” moment-generating process. Still, the parameters differ across markets. For instance, in the case of the Nikkei index, our estimates indicate a purely sentiment-driven process whereas for the USD/EUR rate, we found a pure fundamental determination of the exchange rate. Only a small number of estimates are, however, close to their GMM counterparts, e.g. those for the German stock market and the price of gold. While in the case of GMM estimation different specifications resulted in mostly very similar estimates, we found those of our SMM estimates to exhibit larger variation compared to their GMM counterparts. This, presumably, indicates that even with our large simulated samples, simulated moments still display quite some variation around their analytical benchmarks. All in all, despite certain discrepancies between the SMM and GMM estimation results, received wisdom on the strength of influence of speculative behavior and sentiment in different financial markets is largely confirmed: Those most prone to sentiment-based fluctuations are markets for stocks and precious metal while foreign exchange markets appear relatively more driven by fundamental factors.
We applied our approach to the important task of forecasting future asset price volatility. Forecasts based on the estimated agent-based behavioral model are used to validate its performance vis-à-vis a more traditional stochastic approach using a GARCH specification. Using the estimated parameters of the ABM to compute best linear forecasts for future volatility, we found that the behavioral model generates sensible predictions that get close to those of a standard GARCH (1,1) model in their overall performance and often provide useful information on top of the information incorporated in the baseline GARCH forecasts. Hence, the ABM adds practical value when used in addition to the traditional stochastic econometric approach to volatility modeling, and forecast combinations on the base of both estimated models could provide more accurate predictions of future volatility than any of the models alone.
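The GARCH(1,1) benchmark used for this comparison follows the standard recursion and multi-step forecast formula; the parameter values in the example are illustrative, not the estimates from the study.

```python
def garch_variance_path(returns, omega, alpha, beta, h0):
    """GARCH(1,1) conditional variance recursion:
    h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1}."""
    h = [h0]
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

def garch_forecast(h_t, r_t, omega, alpha, beta, horizon):
    """Multi-step variance forecast: the one-step forecast uses the last
    observed return, and longer horizons decay geometrically (at rate
    alpha + beta) toward the unconditional variance omega/(1-alpha-beta)."""
    h_next = omega + alpha * r_t * r_t + beta * h_t
    uncond = omega / (1.0 - alpha - beta)
    return uncond + (alpha + beta) ** (horizon - 1) * (h_next - uncond)
```

Forecast combinations of the kind mentioned above simply average (or optimally weight) this GARCH forecast with the ABM-based linear forecast at each horizon.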
3.3 Empirical validation of a network model of interbank credit relationships
One important facet of the recent and still ongoing financial crisis that appeared on the agenda of supervisory authorities after the events of 2007/2008 is the propagation of stress through the network of interbank credit links. Little is known so far about the factors that are responsible for the particular network structure of interbank linkages that has emerged in recent years. In this regard, agent-based models for interbank lending activities have been developed. In combination with longitudinal panel data this allowed us, in contrast to previous literature, to analyze the evolution of the network from the perspective of an ABM, providing an avenue for identifying the driving forces behind banks' decisions about which other banks to choose as counterparties in the interbank market.
In a first study, we attempted to identify the driving forces behind banks' link formation in the interbank market by applying the so-called stochastic actor-oriented model (SAOM) adapted from empirical network research in sociology. The model has again been estimated by SMM. The data used here consists of quarterly networks of interbank credit relationships constructed from the transactions on the Italian electronic trading platform (e-MID) for interbank credit over the period from 2001 to 2010. The most pervasive finding of this empirical analysis is that past trades are the most significant predictor of future credit relations which indicates a strong role for the formation of lasting relationships between banks, i.e. what has been called “relationship banking”. It was also found that size-related characteristics of banks have significant explanatory power for the structure of interbank credit relationships, but measures related to interest rates have very limited explanatory power. This indicates that preferential interest rates are the consequence of existing long-term credit relationships. Hence, causality runs from the existence of links to interest rates, not the other way (as a naive application of microeconomic concepts would suggest). Estimating the model for the time before and after the global financial crisis shows similar behavior over the complete period in many respects, but also some remarkable differences. The major behavioral changes found for the period after the onset of the financial crisis were that: (1) large banks and those identified as “core” intermediaries became even more sought after as counterparties and (2) indirect counterparty risk appeared to be more of a concern as a higher tendency to avoid indirect exposure (indicated by clustering effects) was found from 2008 onward.
The results also receive support from an alternative factor-analytic approach. In this analysis we found one dominant factor which mainly describes the change of direction of credit flows between two groups of banks. While for most of the period large banks seemed to collect surplus liquidity from smaller banks, this flow has reversed from time to time. Two more factors exist which describe peaks of activity in the middle of the year and a vague structural change that happened at the end of the second quarter of 2006. These are likely caused by the collapsing housing bubble in the US and the changing monetary policy of the FED around that time. Our analysis reveals a significant level of synchronization of lending activity between banks, which supports the findings of our first study, and can thus help to predict the effects of liquidity shocks on the market.
3.4 Modelling and validation of the interaction between the financial and the real sector
This section intended to extend the previous models by, in particular, linking their financial or banking sector with real economic activity. The resulting financial-real interaction aims to provide a formalization of Minsky's financial instability hypothesis (FIH). Minsky's FIH is a theory of how an economy endogenously generates a financial structure that makes it susceptible to financial crises. Moreover, his theory put the lending-borrowing behavior at center stage and highlighted the negative consequences of loosening lending standards for the well-functioning of the credit market and the real economy (Minsky, 1977). Inability to service debt on the part of the borrowers was seen as a major threat to the sound operation of financial markets. At the macro scale, an unjustified easing of lending standards over extended periods can be seen as a reflection of a severe underestimation of risk. Once the neglected risk materializes, a severe and costly financial crisis ensues, causing systemic repercussions and undermining the overall stability of an economy.
In contrast to most of the previous literature, we did not focus merely on the theoretical implications of the FIH, but also tried to test aspects of the FIH with real world data. The agent-based approach allows for a concrete specification of the expectational elements of Minsky's FIH, thus allowing us to overcome the ad-hoc nature of various formalizations of Minskian dynamics in the contemporaneous post-Keynesian literature. One major problem in the estimation of the structural parameters of a Minskian model is the lack of empirical measures of financial fragility and the distribution of financial attitudes. To overcome this lack of empirical measures, an appropriate ABM had to be formulated and an empirical strategy of validation of the main facets of the FIH had to be developed.
Firstly, we have approached Minsky's financial instability hypothesis via survey data on the evolution of bank lending standards in the United States. This analysis is based on an ABM with loan officers as agents. Herding among loan officers is seen as a potential source for the staggered changes in the terms of lending. Lenders may tend to miscalculate macroeconomic risks in their decision-making process on lending standards due to the fact that they find it difficult to deviate from the dominating conventions governing the lending standards of their peers. In this way, lending standards might not only reflect objective assessments of risk, but might also be influenced by herd effects and pertinent dynamics. The estimated model incorporates such factors and provides empirical evidence on the importance of certain backward-looking variables in shaping future lending attitudes. As it turned out, the empirical estimation of the agent-based model strongly supports the hypothesis derived from Minsky's framework that peer-group pressure or herding within the population of traditional creditors may be responsible for the pronounced inertia in the lending standards. Potential policy implications are either to address pre-crisis easing of lending standards by adapting lending regulations or to compensate abrupt drops in credit supply due to tightened lending standards by swift and strongly expansionary interventions of monetary policy.
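The inertia mechanism can be illustrated with a deliberately stripped-down simulation: each loan officer leans toward a macroeconomic signal but is pulled toward the current majority of peers. All parameters, the population size, and the update rule are illustrative assumptions, not the estimated model.

```python
import random

def step(standards, macro_signal, herding, rng):
    """One updating round for loan officers' standards (1 = easy,
    0 = tight). Each officer eases with a probability that mixes the
    macro signal with the current share of easing peers, the mixing
    weight `herding` capturing peer-group pressure."""
    share_easy = sum(standards) / len(standards)
    p_easy = (1 - herding) * macro_signal + herding * share_easy
    return [1 if rng.random() < p_easy else 0 for _ in standards]

def run(n_steps, herding, seed=7):
    """Aggregate share of easy standards after bad macro news arrives
    (macro_signal drops to 0.2) in a population starting from
    uniformly easy standards."""
    rng = random.Random(seed)
    officers = [1] * 100
    for _ in range(n_steps):
        officers = step(officers, macro_signal=0.2, herding=herding, rng=rng)
    return sum(officers) / len(officers)

sticky = run(2, herding=0.9)    # strong herding: standards ease sluggishly
flexible = run(2, herding=0.0)  # no herding: standards jump to the signal
```

With strong herding the aggregate share of easy standards adjusts only gradually toward the level implied by fundamentals, reproducing qualitatively the staggered, inertial changes in lending terms discussed above.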
The next step in our analysis consisted in the design and empirical implementation of a model of simultaneous real and financial market dynamics taking stock of the insights from our preceding experiences in modeling various facets of the financial sector. To this end, we have developed an endogenous business cycle model with interactions between the real sector of the economy and the stock market. A dynamic expectation formation process of interacting agents in the real sector gives rise to a strong non-linearity and is responsible for the emergence of endogenous business cycles in the model. Particular attention is devoted to the stylized fact that the stock market in the US has been strongly pro-cyclical in the presence of a counter-cyclical monetary policy. We show in the context of such a model that a timid or ineffective monetary policy allows the stock market to be dominated by the fluctuations of profits in the real sector. We could confirm that time series of the US real sector (Panel A) follow a pattern similar to the simulated time series of our model (Panel B). We model the potential ineffectiveness of monetary policy in terms of an endogenous risk premium. The empirical validation shows that the model is able to simultaneously fit a number of key properties of both real and financial data. The main conclusions for monetary policy are, first, that central banks need to react relatively aggressively to high inflation and output gaps and, secondly, that monetary policy has to account for factors that have not traditionally been in its focus, e.g. the currently prevailing risk premium, which potentially affect borrowing conditions.
WP 4: Incomplete Financial Markets in Macroeconomic Models
Incomplete financial markets have not adequately been considered in traditional macroeconomic models before the financial crisis. These models, therefore, were unable to explain or forecast the financial crisis by their very construction. The objective of this work package has been to incorporate the role of financial frictions and balance sheet structure and/or a portfolio choice problem between different financial assets into a traditional macroeconomic framework. In this part of our project, the macroeconomic modelling framework considered resides, generally, within the class of Dynamic Stochastic General Equilibrium (DSGE) models.
4.1 Methodological improvement of portfolio solution methods for DSGE models
Our first objective has been to contribute to the development of analytical methods for proper incorporation of financial components via portfolio choice problems: the question of how, and for what reasons, agents hold their assets and liabilities when multiple assets with different risk characteristics are available to them. Until recently, little attention has been given to portfolio issues and portfolio theory in standard macroeconomic models, not least because of the technical difficulties in dealing with portfolio optimization when using conventional methods such as local approximation (perturbation) methods for solving stochastic general equilibrium models. In principle, portfolio allocation problems can be solved with global approximation methods that give a high degree of accuracy relative to the true solution. However, such global methods naturally suffer from the curse of dimensionality when the economic model under investigation is large and has several state variables.
In order to set the stage for policy-relevant applications, we have compared recently proposed perturbation-based portfolio solution methods (Devereux and Sutherland 2010, 2011) with global solution methods. This helps in understanding when local methods (that are computationally cheap and, unlike global portfolio solution methods, can be used in solving the medium-scale macro models used for evaluation of monetary or fiscal policy nowadays) can be applied and when their use can become problematic. As a test suite the model covers specifications that broadly capture features of international financial relations between advanced economies, and between advanced and emerging economies. Both symmetric country setups and asymmetric setups have been considered that capture important empirical facts such as differences in macroeconomic volatility, differences in portfolio composition, and high equity premia. We found that the local method performs well at business cycle frequencies, both in the symmetric and asymmetric settings, while significant differences arise at long horizons in asymmetric settings.
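The contrast between local and global portfolio solutions can be illustrated on a deliberately minimal one-period, two-asset problem; this toy is not the Devereux-Sutherland method itself, and all numbers are illustrative. The global solution maximizes expected CRRA utility by grid search, while the local benchmark is the standard mean-variance approximation of the risky share.

```python
def crra_utility(c, gamma):
    """CRRA utility with risk aversion gamma > 1."""
    return c ** (1 - gamma) / (1 - gamma)

def global_share(states, probs, rf, gamma, grid=2001):
    """Global solution of a one-period portfolio problem: grid search
    over the equity share in [0, 1] maximizing expected CRRA utility
    of end-of-period wealth (initial wealth normalized to 1)."""
    best, best_u = 0.0, float("-inf")
    for i in range(grid):
        a = i / (grid - 1)
        eu = sum(p * crra_utility(1.0 + rf + a * (r - rf), gamma)
                 for p, r in zip(probs, states))
        if eu > best_u:
            best, best_u = a, eu
    return best

# Two equally likely equity return states (illustrative numbers).
states, probs, rf, gamma = [0.20, -0.10], [0.5, 0.5], 0.01, 5.0
mean_r = sum(p * r for p, r in zip(probs, states))
var_r = sum(p * (r - mean_r) ** 2 for p, r in zip(probs, states))
local = (mean_r - rf) / (gamma * var_r)   # mean-variance approximation
glob = global_share(states, probs, rf, gamma)
```

Here the two answers are close but not identical; the gap between them widens as the size of uncertainty (and higher moments of returns) grows, which is exactly the dimension along which the local and global methods were compared.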
4.2 Determinants of the structure and risk profiles of portfolios in a general equilibrium framework
With the finding of a good performance of local methods at hand, we have evaluated the two main approaches in the literature in their ability to capture the non-normal features of asset returns. In particular, we have compared the performance of two solution approaches that belong to the class of local approximation methods - the approach of Judd and Guu (JG, 2001) and the approach of Devereux and Sutherland (DS, 2010, 2011). We have set up and parameterized a stylized two-period model to match the mean, standard deviation, skewness and kurtosis of return data of aggregate MSCI stock market indices of various regions. The optimal equity holdings in the exact solution are shown to depend on the degree of uncertainty, and the precise form of this relationship is determined by the distributional properties of equity returns. The DS and JG solutions coincide in the limit where uncertainty vanishes, but otherwise differ. As currently implemented, the DS approach does not account for variations in the extent of uncertainty (and its interactions with other statistical properties of returns, such as skewness and kurtosis), unlike JG. Our analysis shows that the resulting discrepancy between the DS and JG solutions can be non-trivial. This makes extending the DS solution to take into account the effect of the size of uncertainty an interesting direction for future research.
4.3 Applications to the macroeconomics of financial frictions
In an application to the macroeconomics of financial frictions, we took as our point of departure the seminal paper of Kiyotaki and Moore (1997), a widely-used setup for introducing financial frictions into micro-founded macro-DSGE models. In this setup, a borrowing agent (representative investor) faces a collateral constraint in borrowing because of an inability to commit to repayment. We examine the robustness of that setup by replacing the usual assumption of a representative investor with investors who are heterogeneous in their ability to borrow against collateral, calibrating the model to the means of the quintiles of the distribution of leverage ratios of US non-financial firms. The saving household now has a portfolio choice of which group to extend its loans to. This modified model setup with heterogeneous investors produces a more pronounced financial amplification of shocks compared to a model version where the parameters of collateral constraints are calibrated to the economy-wide average (the case of homogeneous investors). This is because investors with the highest leverage, not the economy-wide average, are the drivers of asset prices. Asset price drops in response to negative productivity shocks are therefore stronger in the heterogeneous-investors model, tightening the financial constraints of all investors and leading to additional amplification.
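For concreteness, a textbook statement of the Kiyotaki-Moore constraint, with the tightness parameter indexed by investor group to reflect the heterogeneity described above (the notation is illustrative and not taken from the project's papers), reads:

\[
b_{i,t} \;\le\; \theta_i \, \frac{E_t\left[ q_{t+1} \right] k_{i,t}}{R_t},
\]

where \(b_{i,t}\) is borrowing by investor group \(i\), \(k_{i,t}\) its capital holdings serving as collateral, \(q_{t+1}\) the asset price, \(R_t\) the gross interest rate, and \(\theta_i\) the group-specific pledgeability parameter; the homogeneous benchmark corresponds to a common \(\theta\) calibrated to the economy-wide average leverage.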
In light of the bursting of the US housing market bubble, an important question is how monetary and macroprudential policies can dampen the pro-cyclical effects of the housing market on the economy. In this part of the project we have incorporated a housing market with financial frictions into traditional macroeconomic models to examine which policies have the desired counter-cyclical effect.
First, we studied potential policies that could mitigate the pro-cyclicality arising from the interlinkages between current account deficits and financial vulnerabilities. We have employed a two-country DSGE model with heterogeneous households and collateralised debt. The model predicts that external shocks are important in driving current account deficits that are coupled with run-ups in house prices and household debt. In this context the optimal policy features an interest rate response to credit and a loan-to-value (LTV) ratio that responds counter-cyclically to house price dynamics. By allowing an interest rate response to changes in financial variables, the monetary policy authority can improve social welfare because of the large welfare gains accruing to savers. The additional use of a countercyclical LTV ratio that responds to changes in house prices increases the ability of borrowers to smooth consumption over the cycle and is Pareto improving.
We obtained further interesting results on the macroeconomic relevance of housing markets by investigating the transmission mechanism and business cycle properties of a DSGE model with a housing market modelled to incorporate differences in distributional aspects of household leverage. We allow for heterogeneity in borrowers' ability to borrow from collateral constraints that are tied to housing values. Heterogeneous borrowers are assumed to face different loan-to-value ratios in accordance with the quintiles of the distribution of loan performance data. We find a strong intensification of the amplification mechanism for the household debt level when we allow for borrower heterogeneity. This differs from standard models in the literature, where a representative borrower faces a loan-to-value ratio equal to the mean (median) value of the loans' distribution. Moreover, we find some extra amplification for consumption in the presence of a housing preference shock. On the basis of this model, we consider the implications of macroprudential policies aimed at curbing such excesses in household debt. In particular, we find that countercyclical loan-to-value ratios responding to credit-to-GDP growth would be welfare improving for all agents in the economy.
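A countercyclical LTV rule of the kind found to be welfare improving can be sketched as a feedback rule on credit-to-GDP growth (an illustrative specification, not necessarily the exact rule used in the project):

\[
\mathrm{LTV}_t \;=\; \overline{\mathrm{LTV}} \left( \frac{b_t / Y_t}{b_{t-1} / Y_{t-1}} \right)^{-\phi}, \qquad \phi > 0,
\]

so that the permissible loan-to-value ratio \(\mathrm{LTV}_t\) is tightened below its steady-state value \(\overline{\mathrm{LTV}}\) whenever credit \(b_t\) grows faster than output \(Y_t\), and relaxed in downturns.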
4.4 Open economy DSGE models and the yield curve
We have also integrated settings in which agents face a portfolio problem (i.e. where agents have a choice over allocating their wealth between different asset classes) into models of the international macroeconomy. A central question in international macroeconomics is why, despite an unprecedented increase in financial integration and globalization over the last three decades, countries continue to have equity asset holdings that are strongly biased towards their own economy, a feature that has become known as the “equity home bias puzzle”.
One important strand of the literature has emphasized the hedging properties of traded financial assets as a potential explanation. In particular, this literature has shown how the co-movement of dividend income (equity) with (non-diversifiable) labour income strongly influences the capacity of households to diversify via the different components of their overall income. Recent models have shown the importance of extending the baseline one-good international real business cycle model to a two-good structure with home bias in consumption, and, particularly, in investment goods, and the presence of several traded assets in the model economy (bonds traded in addition to equities). Models with these features have contributed significantly towards rationalizing the observed equity home bias: because of a domestic bias in investment goods and firm-level investment, this model class predicts that dividends tend to co-move negatively with labour income, thus providing a relatively good hedge for it, and a rationale for a domestic bias in equity holdings.
We have investigated the role of firms’ financial structure, countries' credit market frictions and financial shocks in determining the extent of the equity home bias. A topic that has so far remained relatively unexplored is how the presence of financial frictions in credit markets and firms’ financial structure affects the hedging properties driving international portfolio choices. To shed light on this issue, we have introduced a standard financial accelerator mechanism into an otherwise frictionless two-country two-good model with firm-level investment. We find that when the degree of home bias in goods decreases, the model predicts that the home bias in equity positions should decrease as well. Furthermore, when the size of financial frictions (in particular, monitoring costs) increases, the degree of home bias in equity intensifies. Our results suggest that the financial structure and the size of credit market imperfections are indeed relevant factors in shaping the observed equity bias in investors' portfolios.
In a related vein, we have analyzed the effects of fiscal and monetary policy on government bond yields. Firstly, we were able to replicate the standard finding of the literature that a rise in government spending increases the yields of government bonds. Additionally, we found that increased uncertainty related to government spending increases the demand for government bonds, as households need to insure themselves against potential drops in their wealth by buying government bonds. Although this decreases the interest rates on government debt, it also affects the economy negatively through the resulting drop in households’ consumption. In such a scenario, monetary policy can play a crucial role in the transmission mechanism from government spending to changes in bond yields. In particular, if monetary policy is accommodative, i.e. responds to both inflation and output, it mitigates the impact of fiscal policy uncertainty on bond prices, as economic agents can expect the central bank to stabilize output in case of a government spending shock.
WP 5: Credit Networks, Leverage and Macro Dynamics
The recent financial crisis has highlighted the importance of agents’ connectivity and of the credit network topology in the analysis of risk sharing and systemic risk. In principle, as the number of agents’ connections increases, the financial network should become less exposed to systemic risk due to the wider dispersion of risk. However, when severe shocks hit individual parts of the network, financial linkages among highly leveraged agents constitute a channel of propagation for contagion. In order to foster our understanding of the endogenous sources of instability in the credit system, we have investigated the relationship between credit supply, leverage and macroeconomic performance in a bottom-up fashion. To be able to identify the endogenous sources of instability, we have developed plausible behavioral hypotheses for agents’ individual behavior, and have designed models with financial linkages among agents such as firms and banks. With these micro-foundations at hand, we have subsequently explored the impact of agents’ behavior and credit relations on macroeconomic dynamics and on the resilience of the economic system.
5.1 Micro-foundation of agents’ behavior
Agent-based models (ABM) allow us to deepen our understanding of credit and interbank markets by analyzing the endogenous evolution of agents' interactions defined by the network of credit relationships. In this task we have (in tandem with 3.3) established a formal micro-foundation of agents' behavior to determine choices in an uncertain environment. The uncertainty in our model arises through the possibility that borrowing firms default. In contrast to traditional macroeconomic models, we have incorporated an interbank market into our model, such that the default risk of one of a bank’s borrowers is propagated to other banks through its interbank connections.
Interest rates on loans in both the credit and the interbank market include a risk premium on leverage and thus depend on the default probabilities of agents. Through this channel, network evolution depends, on the one hand, on the perceived riskiness of agents and, on the other hand, on their capacity to provide loans. In our model, whenever firms need funds to expand their production, they apply for bank loans. The investment decision of a firm depends on the interest rate applied by the bank, which in turn depends on the firm's financial fragility. Because of informational imperfections, a firm can contact just a few potential lenders and can borrow only from one of them. If the selected lender does not possess sufficient liquidity to fully meet the firm's request, it may decide to borrow from a bank with excess funds in the interbank market. In this way the lending bank’s credit risk is propagated through the interbank market. This coherent set of micro-foundations is sufficient to analyze the interbank and bank-firm credit markets as a complex system of two interwoven networks. In such a system the decentralized interaction of individuals allows for substantial shifts of macroeconomic quantities at the aggregate level as the outcome of a self-organizing process, leading to the endogenous formation of business cycle fluctuations, financial bubbles and volatility clustering.
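The matching protocol just described can be illustrated with a minimal sketch: each firm contacts a few banks, borrows from the cheapest, and a lender short of liquidity taps the interbank market. All names, functional forms and parameter values below are hypothetical stand-ins; the actual FinMaP model is considerably richer.

```python
import random

def quoted_rate(bank, firm, base_rate=0.01, premium=0.02):
    """Loan rate: base rate plus a bank-specific markup and a risk
    premium increasing in the borrower's leverage (illustrative)."""
    return base_rate + bank["markup"] + premium * firm["leverage"]

def credit_round(firms, banks, partners=2, seed=0):
    """One decentralized matching round: each firm contacts a few banks
    and borrows from the cheapest one; a lender short of liquidity
    refinances the shortfall on the interbank market, propagating the
    firm's credit risk to other banks."""
    rng = random.Random(seed)
    loans, interbank = [], []
    for firm in firms:
        contacted = rng.sample(banks, min(partners, len(banks)))
        lender = min(contacted, key=lambda b: quoted_rate(b, firm))
        rate = quoted_rate(lender, firm)
        demand = firm["demand"]
        shortfall = max(0.0, demand - lender["liquidity"])
        if shortfall > 0:
            interbank.append((lender["name"], shortfall))
        lender["liquidity"] = max(0.0, lender["liquidity"] - demand)
        loans.append((firm["name"], lender["name"], demand, rate))
    return loans, interbank
```

Running such a round on a small population shows how a single large credit demand can spill over into interbank borrowing, which is the contagion channel discussed above.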
We combined our micro-foundation of agents' behavior with calibration techniques that helped us align the theoretical model with empirical data. To this end, we developed a new calibration procedure designed to validate ABMs. The proposed procedure identifies the set of model parameters by minimizing the sum of the squared residuals between the observed and simulated market prices at a given date. After demonstrating the validity of this calibration procedure in Monte Carlo simulations, we have also tested its efficiency in generating predictions for key macroeconomic quantities. We have not only described aggregate stylized facts on the basis of this model, but also explored its potential for forecasting financial indices. We applied this approach to selected stock market indices for the banking industry. While we found little change in agents' strategies over time, our estimation indicates that the level of risk aversion increased significantly with the onset of the financial crisis.
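The core of the calibration procedure, minimizing the sum of squared residuals between observed and simulated prices, can be sketched on a toy price process. The process, grid search and parameter names below are illustrative stand-ins for the actual ABM and optimizer:

```python
import numpy as np

def simulate_prices(theta, n=200, seed=42):
    """Toy price process standing in for the ABM's simulated market
    price; theta is the behavioral parameter to be calibrated."""
    rng = np.random.default_rng(seed)
    shocks = rng.normal(0.0, 0.01, n)
    return 100.0 * np.exp(np.cumsum(theta * shocks))

def calibrate(observed, grid):
    """Pick the parameter on the grid minimizing the sum of squared
    residuals between observed and simulated prices (common random
    numbers: the same seed is reused across candidate parameters)."""
    sse = [float(np.sum((observed - simulate_prices(t)) ** 2)) for t in grid]
    return grid[int(np.argmin(sse))]

# Pretend the observed series was generated with theta = 1.5 ...
observed = simulate_prices(1.5)
# ... and check that the procedure recovers it from a coarse grid.
theta_hat = calibrate(observed, [0.5 + 0.1 * i for i in range(21)])
```

In a Monte Carlo validation of the kind described above, this recovery exercise would be repeated across many synthetic data sets to assess the estimator's bias and variance.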
5.2 Credit network relationships
Credit networks play a crucial role in diffusing and amplifying local shocks. Following the network-based financial accelerator approach (Delli Gatti et al., 2010; Battiston et al., 2012), an ABM reproducing an artificial credit network populated by heterogeneous firms and banks has been designed. In this framework, firms have access to internal resources as well as credit from banks. Hence, the credit network is composed of the credit agreements established among firms and banks. Firms' production is subject to idiosyncratic demand shocks. If a firm increases its leverage, its expected production and profits increase, but so does its exposure to negative shocks. Moreover, higher levels of target leverage are associated with higher interest rates on loans and a higher probability of credit rationing. Relying on a larger number of lending banks may reduce the threat of credit rationing, but increases the firm’s transaction costs. Firms therefore face various trade-offs with respect to the level of their leverage. Consequently, agents' choices of their target leverage determine both the evolution of the credit network and aggregate output dynamics.
This model was calibrated on a large data set of bank credit extended to firms listed on the Japanese stock exchange from 1980 to 2012, so as to reproduce the levels of leverage, connectivity and output volatility observed in the data. The simulations of this model generate endogenous pro-cyclical fluctuations of credit and connectivity. Indeed, during the first periods of an expansion, banks increase their net worth because they lend to relatively robust firms and only a small number of defaults occurs. Consequently, banks' supply of loans increases, leading to an increase of leverage and connectivity as well. However, high leverage raises firms’ default risk and high connectivity amplifies the effects of local shocks. As a result, aggregate credit, leverage and connectivity are positively correlated with the overall number of defaults. Hence, during expansionary phases, aggregate credit, leverage and connectivity are all increasing, sowing the seeds of a future recession and increasing the probability of large slumps of output.
5.3 The impact of the agents’ behavior and credit relations on the macro dynamics
Credit markets – and in particular the banking system – have played a major role in economic crises, and not only in modern economies. A comparison between the Great Depression of 1929 and the great financial crisis of 2007-2008 offers the insight that financial intermediaries acted as amplifiers of the downturn of the real economy during the Great Depression, while they were at the very epicenter of the crisis in the recent financial turmoil. Based on the insight that in the latter crisis the banking crisis occurred before the recession, we have developed an ABM of financial and economic networks to scrutinize the interactions between the financial and real sectors of an economy, focusing on the relevance of credit markets for overall systemic stability.
In contrast to standard macroeconomic models, this ABM incorporates a multi-country framework with endogenous incremental technological change. In each country firms produce a consumption good that is sold internationally using only labor. Firms introduce incremental innovations through their own innovative efforts or exploit technological spillovers at the country level. Macroeconomic dynamics and the evolution of a country’s competitiveness depend on simple adaptive rules that govern the agents’ behavior and interactions. We showed that in this setting systemic risk is strictly associated with aggregate statistics of credit dynamics, in particular with variations in credit, leverage and credit network connectivity. Moreover, we explored the effect of different fiscal policy strategies, showing that fiscal regimes that are not excessively restrictive tend to reduce systemic risk and increase convergence among countries. However, they may transform private risk into public risk by increasing sovereign debt.
In light of the important role of unconventional monetary policy during the last crisis, we explored its role for macroeconomic dynamics. We introduced two significant innovations into our ABM: a sector producing capital goods, and a financial accelerator mechanism borrowed from the standard DSGE literature on financial frictions. Banks were assumed to set interest rates taking into account the quantity of credit demanded by firms, but they also consider the net worth of the firm as a criterion for their decision on interest rates. Since net worth is used as collateral, it dampens the increase of interest rates for a higher volume of credit. Furthermore, with respect to the standard financial accelerator framework, our set-up takes into account the full heterogeneity of firms, leading to a network between firms and banks formed through a decentralized matching mechanism. In the next step, we used our model for policy purposes. We analyzed the role of unconventional monetary policy after a large-scale crisis in comparison with a scenario in which the central bank behaves according to a standard Taylor rule. According to our model, a zero-lower-bound policy for the short-term interest rate appears to be a powerful tool to smooth the effect of a strong recession, at least in the short run, and at the same time it can prevent a “double dip” recession scenario in which a short recovery after a crisis is followed by another recession.
5.4 The analytical solution of the macroscopic system dynamics
This task deviates somewhat from the remaining schedule of this Work Package. Our aim here was to strengthen the methodological foundations of ABMs by advancing their theoretical penetration. To this end, we have introduced an analytically tractable agent-based herding model. We have used a modification of the model introduced by Kirman (1993) to explain herding behavior in ant colonies when they gather food from two identical sources in their neighborhood. This model has a long tradition of serving as a prototypical framework for the interaction of a large ensemble of agents (e.g. speculators in a financial market) and has also served as a testbed for the development of methods for the estimation of ABMs in Work Package 3. Alfarano et al. (2005) have generalized this model by considering asymmetric transition probabilities while providing a closed-form solution for the distribution of the returns resulting from the price dynamics in a model with heterogeneous speculators. However, in these two models, the agents make their decision about the source by choosing with a probability that is constant in time. The model proposed here allows agents to modify their probability of choosing between the two strategies but, despite this generalization, preserves the analytical tractability of the Alfarano et al. model. This is done by introducing stochastic dynamics for the probability of choosing one strategy rather than the other. This model mimics learning behavior that goes beyond a mere herding mechanism and is capable of capturing the interaction between two major groups of agents: fundamentalists and noise traders. An empirical analysis of the S&P 500 index in the period March 16 - September 27, 2016 illustrates the application of this generalized framework. This period includes the Brexit referendum, so highly volatile index values are observed.
High frequency data and a recent volatility estimator are used to estimate the fraction of noise traders in financial markets and, consequently, to calibrate the model.
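A minimal discrete-time sketch of the underlying Kirman mechanism, with constant (not yet stochastic) switching probabilities and illustrative parameter values, looks as follows:

```python
import random

def kirman(n_agents=100, eps=0.002, delta=0.05, steps=50_000, seed=1):
    """Discrete-time sketch of Kirman's herding model: each step one
    agent is picked at random and switches group with probability
    eps (autonomous switching) plus delta times the relative size of
    the other group (recruitment through random encounters)."""
    rng = random.Random(seed)
    k = n_agents // 2                      # number of agents in group A
    path = []
    for _ in range(steps):
        in_a = rng.random() < k / n_agents         # pick a random agent
        other = (n_agents - k) if in_a else k      # size of other group
        if rng.random() < eps + delta * other / (n_agents - 1):
            k += -1 if in_a else 1
        path.append(k / n_agents)
    return path
```

The generalization developed in this task replaces the constant eps and delta with their own stochastic dynamics, which is what allows the model to capture learning as well as herding.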
5.5 Micro-macro modeling of capital markets
In the last task of WP 5 we have adopted a model from physics to study financial markets, the so-called Ising model. This model serves as a starting point for an important branch of ABMs which is based on a parallel between ferromagnetism and market dynamics, i.e. an arrangement of agents (spins) on a lattice interpreted as a model of agents’ behavior in financial economics. In the same way as spins, the agents are influenced by (make their decisions based on the observed behavior of) their neighbors in the network, or agents with similar beliefs, but also by the overall market sentiment and activity. As demonstrated earlier, the model is able to generate the key stylized facts: fat-tailed returns with exponential decay of the serial correlation structure, aggregate normality, volatility clustering and power-law decay of the serial correlation of squared returns, as well as specific multi-fractal properties. Still, the results and the ability of the model to recover the financial stylized facts are often very sensitive to the parameter choice. Only a narrow range of parameter values yields reasonable results, and outside this range the model usually breaks down in the sense that it converges to a very stable “magnetization” and thus an equilibrium price which results in zero returns.
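A one-dimensional Metropolis sketch conveys the basic mechanics; actual market applications typically use two-dimensional lattices or richer network topologies and couple spins to global sentiment, and the parameter values here are purely illustrative:

```python
import math
import random

def ising_market(n=200, J=1.0, T=2.0, sweeps=500, seed=7):
    """Metropolis sketch of an Ising-type market model: spins on a ring
    are agents' buy/sell positions; 'returns' are taken as changes in
    the magnetization (average spin) between sweeps."""
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n)]
    mags = [sum(spins) / n]
    returns = []
    for _ in range(sweeps):
        for _ in range(n):                 # one Metropolis sweep
            i = rng.randrange(n)
            # energy change from flipping spin i (ring neighbors)
            dE = 2.0 * J * spins[i] * (spins[i - 1] + spins[(i + 1) % n])
            if dE <= 0.0 or rng.random() < math.exp(-dE / T):
                spins[i] = -spins[i]
        m = sum(spins) / n
        returns.append(m - mags[-1])
        mags.append(m)
    return mags, returns
```

The breakdown mentioned above corresponds, in this sketch, to parameter regions where the magnetization freezes at a stable value, so the implied returns collapse to zero.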
We contributed to this topic by analyzing the implications of the financial Ising model for capital market efficiency. We have investigated how the parameters of the model influence the dynamics of returns in the light of the efficient market hypothesis (EMH). We were particularly interested in combinations of parameters which yield an efficient market or dynamics close to it. We showed that the effects of the parameters are more complicated than expected and their influence is apparently non-linear, with a special role played by what is called the “critical temperature” of the Ising system. These results shed new light on the EMH, which is usually presented as a hypothesis with unrealistic assumptions. However, these assumptions on the information agents can access and process are only sufficient, not necessary, and EMH-type behavior (in the sense of speculative rather than purely informative efficiency) can also be obtained under different conditions. We showed that market frictions (up to a certain level) and herding behavior of market participants do not necessarily work against market efficiency; on the contrary, they can lead to market outcomes that appear efficient.
WP 6: Financial Markets Dynamics and Regulation: An Agent-Based Approach
The general objective of this part of the project has been to develop more detailed agent-based models (ABM) of the financial and the real sector than the prototype models described in the previous sections. The aim here was to develop a computational laboratory that captures more facets of the interactions within the financial sector and between the financial and real sphere of the economy. Such models take advantage of the flexibility provided by the agent-based methodology in order to integrate structures and concepts largely missing in mainstream models, but nevertheless crucial for understanding crises. Besides using the developed models to gain a better understanding of the dynamics of complex economic systems in times of crisis, we examined how currently discussed and implemented financial market regulations can contribute to stabilize the financial sector. Furthermore, the potential impacts of financial market regulation on the real sector were analyzed as well.
6.1 Development of a realistic agent-based model of trading activity in a stylized financial market: Reproducing the stylized facts of asset prices
As a first step, we aimed at developing realistic ABMs of trading activity in financial markets. The main innovation of the two models we developed in this part of the project is the integration of balance sheets into the model setup. Including the decisions and constraints that are typically associated with the variables on the asset side and liabilities side of balance sheets was a crucial step in order to allow for a more comprehensive analysis of financial market fragility. Some of the key developments unfolding before (e.g. an increasing reliance on short-term debt) and during (e.g. debt deflation) the recent financial crisis were the consequences of financial market participants’ efforts to optimize their balance sheet structure and to deal with constrained balance sheets.
The key findings of our first model can be summarized as follows. Within the agent-based setting, the empirically observable log-normal distribution of bank balance sheet sizes emerges naturally, and higher levels of leverage lead to greater inequality among agents. Furthermore, greater leverage increases the frequency of bankruptcies and systemic events. Credit frictions, which are defined as the stickiness of debt adjustments, are able to explain a key difference in the relation between leverage and assets observed for different bank types. Lowering credit frictions leads to an increasingly pro-cyclical behavior of leverage, which is typical for investment banks. Nevertheless, the impact of credit frictions on the extent of fragility of the model’s financial system is complex. Lower frictions increase the stability of the system most of the time, yet make systemic events more probable. In particular, we observe an increasing frequency of severe liquidity crises that can lead to the collapse of the entire financial system. As we could also show, appropriate regulation can reduce the probability of systemic events when credit frictions are low. The introduction of a lender of last resort, which provides credit to banks when short-term debt from other sources dries up, significantly decreases the probability of a systemic event in the model economy. For the case where the maturity of debt on agents’ balance sheets is very short term, the frequency of systemic crises decreases from above 10 percent of simulation runs (a time horizon of four years is simulated in each run) to about 4 percent. When, in addition, an entity which gradually unwinds bankrupt banks is included, the vicious circle of fire-sale dynamics is partially broken, which further decreases systemic risk.

The second model extended the degree of balance sheet detail contained in the first model. Besides containing different forms of debt (i.e. deposits, short-term and long-term debt), it includes two different bank types: “commercial banks” and “investment banks”. While the business model of investment banks was responsible for much of the propagation and amplification of the initially relatively small and local shock in the US subprime mortgage market, it is arguably the business model of commercial banks that has the greater impact on the real sector. By including both bank types in our model, we were able to obtain a clearer picture of the transmission channels for shocks not only within the financial sector, but also between the financial and the real sector. We investigated the intrinsic dynamics of our models through systematic simulations. We found that changes to the default rates on loans in the real sector, which have a direct effect solely on commercial bank agents, also have a spillover effect on investment bank agents. Likewise, changes in investment banks’ risk aversion indirectly affect commercial bank agents. Thus, the interbank market constitutes a link between the two types of banks.
6.2 Application of the model
This part of our research was concerned with the application of the previously developed ABM to assess whether financial market regulations can help in alleviating shocks such as the US subprime mortgage shock. The main goal was to conduct simulation analyses with which the impact of selected regulatory reforms on the financial sector’s stability can be tested, such as the liquidity coverage ratio (LCR) of the Basel III accord, which is currently in its implementation phase.
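For reference, the Basel III accord defines the LCR as the ratio of high-quality liquid assets to projected net cash outflows under a 30-day stress scenario:

\[
\mathrm{LCR} \;=\; \frac{\text{stock of high-quality liquid assets (HQLA)}}{\text{total net cash outflows over the next 30 calendar days}} \;\ge\; 100\%,
\]

so that a bank must hold enough unencumbered liquid assets to survive a month-long funding stress without recourse to outside support.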
Specifically, we employed our ABM framework to assess the impact of the liquidity coverage ratio regulation on balance sheets, interest rates and monetary policy transmission. Our findings confirm existing impact assessments in that they suggest that the regulation will lead to a lower supply of bank loans to the real sector, higher interest rates and a shift towards longer-term wholesale funding. When the LCR regulation is the binding constraint on balance sheets, a sharp decline in the role of the short-term interbank market as a funding source disturbs the transmission of monetary policy. In particular, changes of short-term central bank interest rates will be less effective in stimulating or curtailing the supply of loans to the real sector. On the other hand, we find that the lending channel of monetary policy through changes in customer deposits will be slightly more effective under the LCR regulation. Furthermore, we have evaluated the impact of a confidence shock and a solvency shock on the loan supply to the real sector. A large and unexpected shock to confidence, which we model by a temporary increase in the perceived default probability of the commercial bank agents, leads to a severe credit crunch under the LCR regulation. While the regulation has a stabilizing effect on commercial banks, it decreases the stability of investment banks, which are the creditors of commercial banks in the wholesale debt market. A sustained decline in the supply of wholesale funding in response to the confidence shock is ultimately responsible for the credit crunch. In contrast, the LCR regulation does alleviate the immediate adverse consequences of a solvency shock on the loan supply. However, the positive effect is rather modest and short-lived. Lower average profit rates of commercial bank agents lead to a slower recovery and eventually to a detrimental impact of the LCR regulation on loan supply.
6.3 Incorporating the real sector into the model
An economic model is always a strong simplification of reality. Bringing economic theory to the data is therefore a major challenge regardless of the methodology employed. In comparison to typical agent-based or equilibrium models found in the literature, the models we have developed in the previous two tasks are relatively large. In the context of calibration and validation, this implies both advantages and disadvantages. Clearly, the advantage lies in the dimension of heterogeneity a model can reproduce. Banks, the agents in our model, differ in many regards, which needs to be taken into account when trying to replicate the dynamics of the real financial system. The most apparent difference lies in the structure of their balance sheets. It matters, e.g. for financial stability, how balance sheet size is distributed among banks and how balance sheets are funded. Matching observed micro data of real banks to the representation of banks contained in our models becomes easier the more realistic the model structure is. The disadvantage of having a large model pertains to the increased number of parameters that cannot be directly observed in the data and need to be estimated/calibrated by matching aggregate dynamics. For example, the degree to which agents disagree about the fair value of an asset cannot be directly inferred from data. Instead, the parameters of the expectations formation process contained in the model will have to be chosen in such a way that resulting price dynamics produce a good fit with empirical data. However, specific parameters seldom only affect one part of the model. Choosing the parameters to maximize the fit of not just one part of the system, but the system as a whole is challenging.
In Task 6.3 we have developed concepts to match as many parameters of our model as possible to micro-data. Bankscope is our primary data source on balance sheet information of banks registered in the European Union. The data set includes 2,842 entities with a median (average) size of USD 652.57 million (USD 17,002.07 million). A number of steps are necessary in order to match the virtual balance sheets of the agents in our model to the data. These steps include merging similar balance sheet positions into categories that are contained in the model as well as splitting the bank holding companies in our database into two separate entities that represent the commercial banking and investment banking arm of their business. Most importantly, however, we have developed concepts to match the characteristics of individual balance sheet positions to the parameters of our model. In order to integrate liquidity into our setup, we consider the situation in which a group of investors specialized in trading an asset is forced to sell that asset. When the usual buyers are not available, it falls to outside non-specialized investors to absorb the excess supply. We assumed that these outside investors demand a higher return which translates into the price impact. In the context of the model, the group of specialists represents the investment bank agents, while a rest-of-the-world agent takes the role of the outside investors.
The main goal of this calibration was to provide a foundation for modelling the interaction of the banking sector with the real sector. The general purpose of developing a macroeconomic model is twofold: First, by including the interaction between heterogeneous economic agents and a more realistic integration of uncertainty in the expectation formation process of agents, we wanted to facilitate a comprehensive analysis of household and firm behavior during a crisis. Second, we wanted to address a more general epistemological question. By developing a macroeconomic ABM that adopts as many assumptions as possible of its mainstream DSGE siblings, we were able to provide a concrete comparison between the two methodologies. Ideally, the ABM can, as a special case, reproduce the key outcomes of a standard DSGE model. We showed that when there is no uncertainty about a household agent’s income (a simplifying assumption of the New-Keynesian model) the ABM and its DSGE counterpart display similar consumption behavior for a broad range of interest rates. Our ABM, in which households choose their consumption level according to a buffer-stock saving heuristic, displays very similar behavior across this range of interest rates. Thereby, we have shown that it is possible to build an ABM which reproduces the key outcomes of a DSGE model while being more flexible with respect to the underlying assumptions.
WP 7: The European Dimension: Consequences for European Monetary Policy
This part of our project explored the extent to which the new approaches developed in our research could shed light on important policy challenges the European Union (and in particular the Eurozone) has been facing. We believe that herd behavior, sentiment dynamics and multiplicity of equilibria can be captured naturally with the methodological approaches advanced in this project. These factors are potentially important elements of an explanation of past and ongoing events, and they interact in a non-trivial way with the design of the Eurozone. The objective of this chapter was therefore to explore how the new approaches, investigated from a more abstract and general perspective in the previous chapters, can shed light on the particular problems facing the Eurozone.
7.1 Empirical analysis of the debt dynamics of the Eurozone
We empirically analyzed two main problems which European monetary policy has been facing since the European debt crisis: animal spirits and irrational exuberance in the Eurozone sovereign bond markets. Our study uses panel data methods to test competing hypotheses regarding the forces that market sentiment exerted on monetary union member countries, and the influence of market sentiment on decisions to adopt austerity measures. In particular, we tested two theories of the determination of government bond spreads in a monetary union. The first one is based on the efficient market theory. According to this theory, the surging spreads observed from 2010 to the middle of 2012 were the result of deteriorating fundamentals (e.g. domestic government debt, external debt, competitiveness). The implication of the efficient market theory is that the only way these spreads can go down is by improving the fundamentals, mainly through austerity programs aimed at reducing government budget deficits and debts. The second theory, while accepting that fundamentals matter, recognizes that herding behavior based on fear and panic can have dramatic effects on spreads. Such behavior can drive the spreads away from underlying fundamentals. The implication of that theory is that while fundamentals cannot be ignored, there is a special role for the central bank: it has to provide liquidity in times of market panic to prevent countries from being pushed into a bad equilibrium.
We tested these theories and concluded that there is strong evidence for the second theory. In particular, we found that the high spreads over the period 2008-2012 were caused to a large extent by negative market sentiment. Only in a few cases (i.e. Greece and Portugal) did fundamental factors play a major role. In contrast, the ECB’s OMT program led to a pronounced turn of sentiment from negative to positive, leading to more convergence of interest rates than would have been warranted from a fundamental perspective. The policy implication is that the role of the ECB as lender of last resort in the government bond markets has indeed been an important one. We conclude that limiting the capacity of the ECB in this regard would risk undermining this role and, by the same token, the stability of the Eurozone.
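The kind of panel evidence described above can be illustrated with a stripped-down within (fixed-effects) estimator on synthetic data. This is a generic sketch of the method, not the study's actual specification, variables or data:

```python
import numpy as np

def fixed_effects_ols(y, x, group):
    """Within estimator: demean y and x by group (country),
    then run OLS on the demeaned data."""
    y = np.asarray(y, float)
    x = np.asarray(x, float)
    yd, xd = y.copy(), x.copy()
    for g in np.unique(group):
        m = group == g
        yd[m] -= y[m].mean()
        xd[m] -= x[m].mean()
    return (xd @ yd) / (xd @ xd)

# Synthetic panel: spread = country effect + 0.05 * debt ratio + noise
rng = np.random.default_rng(42)
countries = np.repeat(np.arange(10), 20)        # 10 countries, 20 periods
effects = rng.normal(0, 2, 10)[countries]       # country fixed effects
debt = rng.uniform(60, 180, countries.size)     # debt-to-GDP in percent
spread = effects + 0.05 * debt + rng.normal(0, 0.3, countries.size)
beta_hat = fixed_effects_ols(spread, debt, countries)
```

A sentiment-versus-fundamentals test would then compare how much of the variation in spreads such fundamentals can explain across sub-periods, which is where the actual study's panel methods go well beyond this sketch.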
7.2 Agent-based analysis of the fragility of the Eurozone
On the theoretical side, we have produced an agent-based model (ABM) which connects financial frictions to the supply side of the economy to study the fragility of the Eurozone. It has long been recognized that the supply side and financial markets can each be powerful transmitters of shocks. However, not much effort has so far been invested in understanding the interaction between them. In particular, it is less well understood in what ways and under what conditions the supply side can work as a propagator of shocks generated in the financial sector, or more generally of financial shocks. Nonetheless, a number of recent empirical studies suggest that the reason behind the severe contraction during the Great Recession was a sharp drop in aggregate supply, even though the original negative shocks were generated in the financial sector.
Our ABM is extended to include a financial friction on the supply side. Firms finance investment using external funding, but need to pay for it in advance. In addition, firms’ financing constraint and net worth are determined by stock market prices which can (and will) deviate from the fundamental value. The result is that production, the supply of credit and the share that firms pay to capital producers depend heavily on stock market cycles. During phases of optimism, credit is abundant, access to production capital is easy, the cash-in-advance constraint is lax, risks are undervalued, and production is booming. But upon a reversal in market sentiment, the contraction in all these variables is deeper and asymmetric. This is even more evident in the behavioral model, since the cognitive limitations of economic agents exacerbate the contraction. As it turns out, the behavioral model matches much of the data, including the interest rate, inflation, firm credit, the firm financing spread, and bank net worth. It is also successful in matching several supply-side relations (capital-firm credit, inflation-interest rate) as well as their autocorrelations. The results from the empirical validation are favorable to the behavioral model. Thus, our model is capable of explaining booming production during phases of optimism, but also the fragility of markets when market sentiment reverses. In line with several empirical studies for European countries, our model shows that a weak and anemic supply side can be the key driver of a recession. It works as the propagator of shocks originated elsewhere, e.g. in the financial market.
7.3 Evaluating European monetary policy during the sovereign debt crisis
There is a long line of empirical research highlighting a strong link between firm characteristics, corporate finance structure and monetary policy transmission. This literature shows that the effectiveness of monetary policy and the asymmetric impact it will have on the economy is dependent on the type of firms in the economy and the composition of the funding sources. Understanding this link has become even timelier in the current context of unconventional monetary policy since the interest rate is at the zero-lower bound and an enormous amount of liquidity has been injected into the banking system. Yet the willingness of banks to lend to firms, in particular to SMEs has been (in relative terms) weak. This is true for both the Euro Area and the UK, but to a lesser extent also for the US.
Therefore, the question of whether the composition of the firm sector matters for the effective transmission of monetary policy has re-emerged. We analyzed this question by extending the ABM of the previous section. In particular, while we maintained the effects of monetary policy on loan demand and supply, we regarded the assumption that agents understand the entire structure of the economy and can form decisions using the full information set as too strong. We therefore relaxed it and assumed that agents in the economy are boundedly rational in the sense that they learn from the past and rationally optimize in their own sphere, but hold incomplete information regarding the aggregate structure of the economy. We found convincing evidence that the monetary transmission channel is stronger in the bank-based system than in the market-based system: impulse responses to a monetary expansion are about twice as large as those in the market-based framework. The (asymmetric) effectiveness of monetary policy in counteracting busts is, on the other hand, relatively higher in the market-based model. The statistical fit of the bank-based behavioral model was also improved compared to the benchmark model. Lastly, we found that a market-based (bank-based) financing friction in a general equilibrium setting generates highly asymmetric (symmetric) distributions and more (less) pronounced business cycles.
7.4 Proposals for reform of the Eurozone
The design problems of the Eurozone have been recognized quite late and have induced the Eurozone policymakers to apply problematic macroeconomic policies since the eruption of the sovereign debt crisis. This has led to a dismal economic performance of the Eurozone countries compared to the EU countries that decided not to be part of the monetary union.
Since the sovereign debt crisis in the Eurozone, member countries have been pushed towards introducing more flexibility into labour and product markets. This drive towards structural reforms was very much influenced by the traditional theory of optimal currency areas (OCA). This theory stresses that in the face of asymmetric shocks member countries should have a sufficient degree of labour and product market flexibility to adjust to these shocks. Without such flexibility adjustment will be impossible, thereby undermining the sustainability of the monetary union.
The underlying assumption of the OCA prescription for structural reform is that asymmetric shocks are permanent (e.g. permanent changes in preferences or productivity shocks). When the shocks are temporary, it does not necessarily follow that more flexibility is the answer. More specifically, when the shocks are the result of unsynchronised business cycle movements, the appropriate way to deal with them would be by stabilisation efforts. In our empirical analysis we have provided evidence to suggest that the most significant shocks in the Eurozone have been the result of booms and busts, driven by waves of optimism and pessimism. These business-cycle movements have been relatively well-synchronised. What was not synchronised was the amplitude of these business-cycle movements, where some countries experienced much greater amplitude in their business cycles than others. In principle, these business-cycle movements could be stabilised at the national level without the need for a budgetary union. However, as the amplitude of these movements is so different, countries experiencing the deepest recessions are likely to be hit by “sudden stops”, i.e. liquidity outflows triggered by fear and panic, which forces them to switch off the automatic stabilisers in the budget, preventing them from conducting effective stabilisation policy.
We argue that an appropriate way to deal with the business cycle movements whose amplitude is unsynchronised is by introducing a budgetary union. By centralising part of the national budgets into a common budget managed by a common political authority, the various increases in budget deficits following from a (common) recession translate into a budget deficit at the union level. As a result, the destabilising flows of liquidity between countries during recessions disappear, and the common budgetary authority can allow the automatic stabiliser in the common budget to perform its role in smoothing the business cycle. However, it currently seems highly unlikely that the governance of the Eurozone will move in the direction of creating institutions capable of providing the necessary stabilisation of booms and busts that national governments are no longer able to provide.

Potential Impact:
1. Main socio-economic impact and exploitation of results
In line with the target of the SSH.2013.1.3-2 funding line, the objectives of FinMaP were exactly those outlined in the tender published in 2013: To better integrate financial markets and institutions into policy-oriented macro models and in this way provide improved or new decision support tools for monetary policy makers and regulators.
Our research included all the dimensions emphasized in the pertinent call:
• integration of binding credit constraints and liquidity freezes in macroeconomic models,
• identification of speculative bubbles and development of pertinent early warning indicators,
• exploration of the behaviour of agents under risk and uncertainty, and of the possibility of monetary policy to influence agents’ expectation formation to prevent ‘bad’ outcomes,
• modelling of financial institutions, their interactions and how they interact with monetary authorities in the transmission of monetary policy.
Our research has used a broad portfolio of methodological approaches, some of which are used traditionally by central banks, and others which are relatively near but of high relevance as newly developing decision support tools for monetary policy. Our methodological portfolio included:
• dynamic stochastic general equilibrium (DSGE) models, which before the crisis were the dominant tool used in central banks of developed and developing economies for policy analysis and macroeconomic forecasting. Our research has addressed the major limitations of these models that have come to the fore with the financial crisis of 2007/8, namely the absence of financial markets, a banking sector and financial frictions that have macroeconomic repercussions in the real sector of the economy.
• agent-based models (ABMs) that have been promoted as a promising alternative avenue in the face of the methodological limitations of the DSGE apparatus. We have developed ABMs for particular segments of the financial sector, as well as for a comprehensive model of the banking sector and its interaction with the real sphere. We have also started to use these models as a new computational platform to conduct counterfactual policy analyses.
• network models of the financial sector: This new class of models has gained enormous attention as a tool to investigate questions of systemic risk due to high connectivity within the financial sector, possible chains of contagious defaults and macroprudential policy to make the financial sector more resilient. Since 2008, this has become an important decision support tool for regulators, and a major framework for research in this area. Our research within FinMaP has been developed in close cooperation with colleagues at central banks and has contributed to the research questions identified in this area.
• experimental research on the formation of expectations in financial markets, and the problem of interaction between private and public information. The main objective of this strand of our research was to help monetary authorities to fine-tune their communication strategy and to help evaluate the work of rating agencies and related institutions issuing publicly available information signals that market participants are interpreting in the light of this private information. Such public information might play the role of a coordination device that goes beyond its material informational content, and thus can be a delicate instrument in the hand of policy makers.

The main impact of the project, therefore, lies in the development of decision support systems, new models and methodologies that are of potential usefulness for the analysis of financial markets, their interaction with the real economy, the transmission mechanism of monetary policy and the effects of regulatory reforms. These methods and models are designed to be applicable by monetary authorities and regulators. They have been developed in close exchange with monetary practitioners and have been communicated to these ‘end users’ via specially designed workshops, as well as via academic publications and presentations at international conferences.
2. Dissemination within the scientific community
Dissemination within the scientific community relied on the traditional channels of publications in learned journals and presentations in conferences and workshops. We targeted journals of high caliber and visibility as well as key annual conferences of the international scientific community to disseminate our results and were successful to this end. The publication performance of the participating groups supports the view that we were able to tackle highly pertinent and topical issues on the agenda of both academic research and economic policy-making.
Learned journals of high reputation in which output of the project has already been published include, among others:
- Computational Economics
- Energy Economics
- European Journal of Operational Research
- Journal of Banking and Finance
- Journal of Economic Dynamics and Control
- Journal of Empirical Finance
- Journal of International Money and Finance
Due to the typically very long editorial processes of academic journals in the field of economics, many more finished research papers from the project have already been submitted, but not finally accepted. We, therefore, expect the number of publications from the project in journals of high caliber to increase further over the next years.
Topical international academic conferences at which our research has been presented include, among others:
- Annual Conferences on Computing in Economics and Finance of the Society for Computational Economics
- Annual INFINITI Conferences
- Annual Workshops on the Economic Science with Heterogeneous Interacting Agents (WEHIA)
- RiskLab/Bank of Finland Conference on Systemic Risk Analytics
- European Economic Association Annual Meeting
- IFABS Conference
- International Conference on Econophysics
- World Finance Conference
3. Dissemination to stakeholders
Besides these traditional avenues of academic dissemination, we also distributed our main results to the broader public and, in particular, to interested actors in the area of public and monetary policy. Following a format developed at the Kiel Institute for the World Economy (IfW), two special events (policy clinics) targeted at policy makers were organized. The format of our “policy clinics” consisted of workshops bringing together researchers of our consortium with interested persons from policy and public administration. Consortium members contributed presentations on the key findings of our project and their relevance for policy on a non-technical level. Each presentation was assigned a discussant from outside the academic sector who commented on the practical implications of the research. Due to the nature of our research, the main targeted audience consisted of representatives of central banks and other institutions interested in strategies and decision support for monetary policy. Among the discussants of the two policy clinics, the following institutions were represented:
- Bank of Italy
- Bank of Spain
- Central Bank of Luxembourg
- European Central Bank
- Directorate General European Commission
- National Bank of Austria
- National Bank of Belgium
- Swiss National Bank
The policy clinics also featured open discussion rounds and further interactive elements. Due to the central role of these Policy Clinics in our dissemination strategy, we give in the following a detailed account of their schedule and content.
1st Policy Clinic, Leuven, 3rd June 2016
The first policy clinic in June at the Katholieke Universiteit Leuven had as its topic “Financial Market Distortions, Systemic Risk and Economic Policy: New Avenues and European Perspectives”. The conference presentations addressed the complex relationships between financial distortions and macroeconomic performance, the linkages between financial markets and real economic activity, and regulatory measures to cope with the dysfunctionality and instability of financial markets, as well as the consequences of financial frictions for the conduct of monetary policy and appropriate reactions of supervisory authorities to prevent financial distress.
The first contribution was by Dr. Yuemei Ji, who shared her research on the Eurozone debt crisis and its relationship with market sentiment within WP7. In her joint empirical work with Prof. de Grauwe she finds evidence that members of the monetary union can be forced into a bad equilibrium by the financial markets. Such countries do not have control over the currency in which they issue their debt and can experience liquidity shortages in the presence of adverse market sentiment, which can then translate into insolvency risk due to increased refinancing costs. A central bank in a monetary union can provide more liquidity to the sovereign debt market to prevent such a bad equilibrium from happening. However, to ensure the long-run stability of the monetary union, this policy should be accompanied by measures for reducing debt at the level of the individual countries experiencing difficulties.
Prof. Stefan Reitz presented his research within WP2. Together with his collaborators he has studied the expectation formation processes in the US stock market. Expectation formation is modeled in terms of extrapolative and regressive components, such that the importance of each component can be measured for different states of the market (during booms and busts). Their results suggest that heterogeneous expectations and their structure over the cycle play a key role for the emergence of the observed stock market dynamics. Thus, approaches that explicitly model and allow for heterogeneous expectations should be used for effective policy analysis.
Dr. Eddie Gerba presented results from two papers coauthored with Prof. Paul de Grauwe based on an extended New-Keynesian modeling framework developed within WP7. The authors consider a New-Keynesian model with behavioral features as well as a more standard rational expectations version. Their extensions aim at introducing supply side financial frictions and a banking sector to the framework and contribute to a better understanding of the interactions between the real and the financial sectors, and thus, potentially, to a more effective policy analysis. In particular, in this context, a market based financing system is compared to a bank-based one in terms of the implications for monetary policy transmission and the propagation of shocks. Their analysis clearly indicates the superiority of the behavioral New-Keynesian model over the rational expectations one in terms of its ability to capture the observed regularities in macro data. The model is capable of explaining booming production during phases of optimism, but also that markets are fragile when market sentiment reverses. The model also shows that a weak and anemic supply side can be the key driver for a recession, and how financial frictions affect the channels of monetary transmission.
Prof. Simone Alfarano discussed results from his paper on public communication and disclosure strategies employed by policy bodies (research conducted in WP1). These issues are studied in the context of controlled laboratory experiments. The key findings from this research relate to the effects of public information on the coordination process taking place in markets (in this case a stylized stock market). Public information is common knowledge and is observed to bring about coordinating behavior, as the agents systematically overweight its importance relative to private information. In the context of the experiment, this reliance of the agents on public information does not appear rational, since the agents have no incentive to coordinate. Public information is thus liable to bring about a coordination of behavior over a particular outcome under very general conditions. This outcome might be undesirable, in particular if the public signal is imprecise or systematically biased. The author proposes the introduction of multiple independent public signals to avoid coordination over undesirable outcomes and shows that, in the context of the experimental setting, this works well.
Dr. Federico Giri presented his work within WP5 on early warning indicators for financial instability on the macroeconomic level in the context of an agent-based model with interactions between firms and banks. Such research can be used for the development of effective macro-prudential policies. The structure of the dynamic credit network resulting from the agent-based model has strong implications for the vulnerability of the system to idiosyncratic shocks. Because of this, it is strongly recommended that particular network measures or statistics be included in the list of early warning indicators. In particular, measures of interconnectedness and the relative size of banks seem to capture potential vulnerabilities well.
Finally, Mr. Jesper Riedler shared insights from his research on financial regulations in the context of an agent-based model of the banking sector (research within WP6). In particular, he analyses the effects of a minimum “liquidity coverage ratio” on various aspects of the banking system. His research suggests that banks would be able to comply with this regulation without great difficulties. On the positive side, banks would need to pay more attention to maturity mismatches between assets and liabilities and short-term interbank debt would no longer be used to finance long-term loans. On the negative side, wholesale refinancing would become more expensive and the contagion between debtors and creditors would increase. Finally, the research predicts that the positive impact of this regulatory measure on the loan portfolio of the banks would only be temporary.
The presentations of FinMaP research output have been discussed and commented on by Dr. Werner Roeger (European Commission, DG ECFIN, also member of the FinMaP Advisory Board), Dr. Romain Baeriswyl (Swiss National Bank) and Dr. Pablo Rovira Kaltwasser (Central Bank of Belgium). The policy clinic concluded with a panel discussion on the prospects for joint monetary and fiscal policy in the Eurozone. Next to the coordinator of Work Package 7, Paul de Grauwe, the former president of the European Council, Dr. Herman van Rompuy, the head of the forecasting center of the Kiel Institute for the World Economy, Dr. Stefan Kooths, and Prof. Mario Pianta from the Universita di Urbino contributed to the panel discussion.
2nd Policy Clinic, Rome, 29th November 2016
The second policy clinic that took place at the Sapienza Universita di Roma was entitled “New Avenues for the Analysis of Financial Regulations”. It was devoted to new approaches for modelling bank behavior and the intrinsic network structure of the financial sector, its implications for financial stability and the design of optimal financial regulations in the context of network theory and agent-based models. Finally, the interplay between the real and financial sectors of the economy was examined through the lens of bank-firm credit networks. A keynote lecture on the practice of macro stress testing was contributed by Christoph Siebenbrunner from the National Bank of Austria, followed by presentations of the results from our project and discussions.
The first presentation by Mr. Christoph Siebenbrunner (Central Bank of Austria) set the stage for the overall topic of this event. Mr. Siebenbrunner provided details of the experiences at the Bank of Austria with the functioning of stress tests for banks. Top-down stress tests are generally carried out at a high level of aggregation and thus lack the necessary degree of detail to assess risk arising from the structure and interactions of banks’ exposures. Bottom-up stress tests, on the other hand, have the potential to capture such effects since they are done in close collaboration with the individual banks, which can then provide the necessary amount of detail. However, bottom-up stress tests tend to be very costly to perform, and the aggregation of the results across banks is difficult. Mr. Siebenbrunner also discussed extensions that can account for feedback effects between liquidity and solvency, as well as direct contagion effects (in the presence of interbank market data) and indirect effects coming from interactions between fire sales, liquidity and solvency.
Prof. Thomas Lux presented his research on contagion effects in a bipartite firm-bank and bank-bank credit network within WP3. He generates randomized simulated credit networks based on the empirical regularities in the observed degree and size distributions in bank-firm networks (using data from Italy). The simulations produce highly connected networks which carry the risk of a complete system breakdown as a result of idiosyncratic shocks. There are three types of knock-on effects that operate in that framework. A firm’s default can trigger interbank contagion, a tightening of the borrowing constraints for other firms and, finally, can also trigger asset price deflation caused by fire sales (necessary for meeting liquidity requirements). The modeling framework suggests that the best way to increase the stability of the system is to increase capital buffers at the firm and bank levels.
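The knock-on mechanism can be illustrated with a deliberately simplified default-cascade simulation on a random bipartite firm-bank credit network. All parameters and the cascade rule below are illustrative stand-ins, not the calibrated model from the paper:

```python
import random

def simulate_contagion(n_firms=200, n_banks=10, links_per_firm=2,
                       bank_capital=5.0, exposure=1.0, seed=7):
    """Toy knock-on cascade: an initial firm default imposes losses on
    its lender banks; a bank whose accumulated losses exceed its capital
    fails, and all of its remaining borrowers lose funding and default
    in turn. Returns (firms defaulted, banks failed)."""
    rng = random.Random(seed)
    lenders = {f: rng.sample(range(n_banks), links_per_firm)
               for f in range(n_firms)}
    losses = [0.0] * n_banks
    failed_banks, defaulted = set(), {0}   # firm 0 defaults first
    frontier = [0]
    while frontier:
        new_frontier = []
        for f in frontier:
            for b in lenders[f]:
                if b in failed_banks:
                    continue
                losses[b] += exposure
                if losses[b] > bank_capital:
                    failed_banks.add(b)
                    # all borrowers of a failed bank default
                    for g, bs in lenders.items():
                        if b in bs and g not in defaulted:
                            defaulted.add(g)
                            new_frontier.append(g)
        frontier = new_frontier
    return len(defaulted), len(failed_banks)
```

With large capital buffers the initial default remains isolated, while with thin buffers the dense connectivity lets a single idiosyncratic shock propagate widely, which is the qualitative point of the presented framework.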
The contribution of Prof. Leonardo Bargigli (research within WP5) also developed an agent-based model of firm-bank relations in which a firm can be connected to multiple banks. The authors use a comprehensive data set for Japanese firms and banks to conduct rigorous statistical calibration exercises for their model. This is a particularly challenging endeavor since, in agent-based models, the relationships between the various aggregate variables generally cannot be derived analytically, and it is hard to link parameter values governing behavior at the micro-level to a particular behavior of the system at the macro-level. The parameter estimation is thus plagued by misspecification and endogeneity issues. For these reasons the authors implement generalized least squares techniques that are more robust to such issues, together with out-of-sample validation of the estimated parameters. They are able to achieve a good fit for two of the endogenous macro-variables in the model.
As another research outcome from WP5 with immediate relevance for economic policy, Dr. Federico Giri presented his analysis (with coauthors) of unconventional monetary policy in the context of an agent-based model of the macroeconomy with a financial accelerator. Agent-based models allow for the modeling of crisis events emerging from the interactions of the heterogeneous agents in the model economy. Thus, such models are a natural starting point for policy analysis centered on mitigating or preventing such events. The authors find that a sharp increase in the base interest rate is liable to cause a large-scale crisis and that a premature return of the base rate to its “natural” level can lead to a double-dip recession. The modeling framework also suggests that monetary policy can stabilize the economy (at least in the short run) by keeping the base interest rate close to the zero lower bound.
Jesper Riedler shared insights from his research within WP6 on financial regulation in the context of an agent-based model of the banking sector. In particular, he analyses the effects of a minimum “liquidity coverage ratio” (LCR) on various aspects of the banking system, which the scope of the model permits in great detail. He and his coauthors find that under an LCR regulation the effectiveness of the central bank’s marginal lending rate is reduced, and that commercial banks would attempt to keep the average maturity of their wholesale funding above the average maturity of their assets. The average return on assets of commercial banks would also generally be smaller under an LCR, particularly for longer maturities of their loan portfolios. The authors also investigate the interactions between various shocks to the financial system and the minimum LCR. The research suggests that the implementation of an LCR would aggravate the adverse effect of a confidence shock on loan supply to the real sector, while commercial bank equity would be bolstered (particularly during the first year after the shock). For a solvency shock, the difference between the reactions of the banking system with and without an LCR regulation appears to be economically insignificant.
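For reference, the regulatory constraint studied here follows the Basel III definition: high-quality liquid assets (HQLA) must cover the net cash outflows expected in a 30-day stress window, where inflows may offset at most 75% of gross outflows. The snippet below encodes this standard textbook definition; the model’s own implementation of bank balance sheets is, of course, more elaborate.

```python
def lcr(hqla, outflows_30d, inflows_30d):
    """Liquidity coverage ratio: stock of high-quality liquid assets over
    total net cash outflows expected in a 30-day stress scenario. Under
    Basel III, inflows may offset at most 75% of gross outflows, so net
    outflows are never below 25% of gross outflows. An LCR >= 1.0 is
    compliant."""
    net_outflows = outflows_30d - min(inflows_30d, 0.75 * outflows_30d)
    return hqla / net_outflows
```

For example, a bank with 100 in HQLA, 120 in expected outflows and 100 in expected inflows has net outflows of 30 (the inflow cap binds at 90) and is comfortably compliant.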
Dr. Simone Berardi presented research from WP5 on the relationship between different behavioral strategies of the agents in a banking system and the system’s overall properties and stability. In particular, on an interbank credit market, banks select their trading partners by using a fitness rule. Each bank can choose how much importance to attach to the fitness signal in general, and how much relative importance to attach to its two components (the required interest rate and liquidity). In this setting, a concentrated network in which a few hubs provide lending to the rest of the system is associated with financial fragility. The best way to avoid such concentration in the model is for the banks to adopt a mixed strategy (between the two components of the signal) and not to attach too much importance to the fitness signal overall.
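A fitness rule of this kind can be sketched as a discrete-choice (logit) matching mechanism. The illustration below is generic rather than the model’s actual specification: `alpha` (the relative weight on a low interest rate versus liquidity) and `beta` (the overall importance attached to the fitness signal) are the two behavioral knobs discussed above, with `beta = 0` corresponding to purely random matching.

```python
import math
import random

def attractiveness(rate, liquidity, alpha):
    # Fitness signal: weighted mix of its two components. A lower required
    # interest rate and a higher liquidity both raise attractiveness.
    return alpha * (-rate) + (1 - alpha) * liquidity

def choose_lender(lenders, alpha, beta, rng=random):
    """Pick a lender by a logit rule over the fitness signal.

    lenders: name -> (required_rate, liquidity)
    beta:    importance attached to the signal (0 -> uniform random choice;
             large -> the fittest lender is chosen almost surely, which is
             what concentrates the network around a few hubs).
    """
    names = list(lenders)
    weights = []
    for name in names:
        rate, liquidity = lenders[name]
        weights.append(math.exp(beta * attractiveness(rate, liquidity, alpha)))
    # Roulette-wheel draw proportional to the logit weights.
    r = rng.random() * sum(weights)
    for name, w in zip(names, weights):
        r -= w
        if r <= 0:
            return name
    return names[-1]
```

The fragility result reported above corresponds to the large-`beta` regime: when every bank chases the same fittest counterparties, lending concentrates on a few hubs.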
Finally, Angélica Domínguez gave a talk on research within WP3 on the formation and characteristics of an interbank market under the Basel III framework. The model is an extension of a dynamic network model of the interbank market developed by Lux (2015). This project provides a framework for analyzing the stability of the financial network in a dynamic setting of evolving credit relationships, and allows an assessment of the role and importance of each of the separate components of Basel III during ‘normal’ and crisis times.
The presentations of FinMaP research output have been discussed and commented on by Dr. Massimiliano Affinito (Bank of Italy), Dr. Lorenzo Burlon (Bank of Italy), Dr. Eddie Gerba (Bank of Spain, and former member of the FinMaP consortium), Dr. Grzegorz Halaj (European Central Bank), Dr. Mattia Montagna (European Central Bank) and Dr. Pablo Rovira Kaltwasser (Central Bank of Belgium).
Besides academic publications, the findings of our research project were also made available to a broader public via so-called non-technical policy briefs, i.e. focused reports on the policy implications of new developments in theoretical or empirical research. These have the length of a lead article in the business or science pages of a quality newspaper. The policy implications of each of the Work Packages have been summarized in a policy brief to disseminate their results to a wider audience.
4. Other important dissemination activities
The consortium also particularly encouraged the participation at international conferences of the young researchers employed within this project at the partner institutions. Given that we spent major efforts on the development of new methodology for improving upon extant ABM and DSGE models, on the application of network theory to the analysis of systemic risk, and on new econometric approaches to diagnose and forecast speculative bubbles, the young researchers involved were important bearers of this knowledge and human capital.
Beyond the circle of direct participants, we disseminated our results via additional avenues. First, the newly developed methods and models were taught in summer schools of scientific societies with which the main researchers of this project are affiliated. One important venue, at which most members of the FinMaP consortium (and also researchers of MACFINROBODS) regularly met and presented new results, has been the Annual Meeting of the Society for Economic Science with Heterogeneous Interacting Agents (ESHIA). These annual meetings have become the major forum for exchange on new research using agent-based models in economics. Alongside its annual meetings, ESHIA organized accompanying summer (as well as winter) schools, which have regularly been attended by about 50 young scientists each in recent years.
The coordinator of the FinMaP project, Thomas Lux, has been a co-organizer and lecturer in all ESHIA summer schools that took place during the project lifetime: in Tianjin (2014), Nice (2015) and Castellon (2016). In all these summer schools, various presentations by members of the FinMaP project disseminated the methodological results to an audience of young international scientists. Thomas Lux has also presented material related to FinMaP at schools organized by other institutions, e.g. at the conference “Physics meets Finance” at the University of Ulm (2015) and the workshop “Pluralism in Economics” in Berlin (2014), which had a more policy-oriented agenda. He also gave an open public lecture on the interbank network research within the FinMaP project at the University of Kiel in 2016.
There are various other societies and groups (the Society for Computational Economics and the Artificial Economics conference series, among others) in which consortium members have routinely been involved and arranged for special invited sessions on topics pursued within our project. Besides specialized meetings and workshops, we also targeted the main general conferences of our field and policy-related avenues for informed discussions with policy-makers. For example, Paul de Grauwe was a keynote speaker at multiple conferences during the project period and published articles on the USAPP and EUROPP blogs of the London School of Economics. Many international conferences of the highest relevance, such as the annual meeting of the European Economic Association (Katrin Rabitsch) or the INFINITI conference (Eddie Gerba, Simone Alfarano, Gabriele Tedeschi), were attended and our research was presented there. Further, project members visited public institutions such as the International Monetary Fund to give presentations and seminars (Paul de Grauwe), or were appointed as members of the Monetary Experts Panel of the European Parliament (Roman Horvath).
General visibility of the project was supported by the creation of a project identity, with a short presentation of the project, a common template for presentations at conferences, workshops and seminars, and a logo representing the project. The project website contains all information on the participating groups and scientists and collects all the output of the project. Scientific papers and policy briefs, as well as related material, are made available to the broader public through this website.
Dissemination activity also included networking, particularly with other EU projects on similar topics. Some FinMaP partners also participated in other projects funded by the European Commission. For instance, the Universitat Jaume I as well as the Universita Politecnica delle Marche were participants in another FP7 project, SYMPHONY. The latter university was also a partner in the projects RASTANEWS and MatheMACS, and maintained close contacts with the parallel project MACFINROBODS of the same funding line. Furthermore, individual collaborations existed with the ECB and the Bank of Portugal on project-related and policy-relevant topics. This constituted a natural bridge between different projects of similar orientation.

List of Websites: