Speech by Anita Angelovska-Bezhoska, Vice Governor of the NBRM, at the joint National Bank of the Republic of Macedonia/ECB seminar on statistics for participants from the region, Skopje, October 2-5, 2013
Dear guests, dear colleagues,
On behalf of the NBRM, let me express my sincere gratitude to the ECB for co-organizing the event and to all participants for their willingness to share their experiences, identify the main strengths and weaknesses and pinpoint the main challenges in their own fields of work. It gives me great pleasure to welcome you to today's joint event, which I am sure will provide an excellent platform to discuss statistics, one of the most challenging and fundamental issues for central bankers and policy makers in general. By now, we are all aware that timely, reliable and relevant data are a sine qua non for good policy making. The higher the quality of statistics, the greater the probability of timely and proper decisions.
The recent "great recession" has brought to the fore many deficiencies in the fundamentals of the global economy, in the architecture of the global financial system and in the set-up of the policy decision-making process, but also in the area of statistics, which is one of the main ingredients of good economic analysis and policy making. As economies, markets and institutions evolve, the pressure on statistics to "keep pace" is increasing. Hence, the occurrence of so-called "information gaps" is inevitable in a rapidly changing world. But adverse shocks such as the recent crisis widen these gaps and make their repercussions far more obvious and painful than in normal times. In times of crisis, these gaps constrain our ability to recognize and react to existing vulnerabilities. Let me quote the IMF at this point of my speech: "Indeed, the recent crisis has reaffirmed an old lesson—good data and good analysis are the lifeblood of effective surveillance and policy responses at both the national and international levels".
I would like to focus the rest of my opening address on the main challenges in the area of statistics, especially those most relevant for the successful fulfillment of central banks' mandates. In general terms, the crisis revealed the need for richer and more streamlined statistics, a mix of statistics that will serve monetary policy and macro-prudential policy concomitantly, as the two policies are intertwined. What are the main information gaps against the backdrop of the crisis? I think the best starting point for discussing these issues is the 2009 recommendations of the IMF and the Financial Stability Board under the G-20 Data Gaps Initiative:
1. Statistics should be devised in a manner that better captures the build-up of risks in the financial sector. As we all know, the global regulatory reform essentially addresses the lessons of the crisis: that insufficient capital and liquidity buffers, inadequate on- and off-balance sheet risk coverage and excessive leverage can undermine the stability of the financial system. The role of statistics in this regard is to improve the current statistical frameworks and to design new ones so as to provide adequate information for detecting vulnerabilities in the financial sector. The main avenues here are: redesigning the set of financial stability indicators to improve their value added for assessing the health of financial systems, measuring aggregate leverage and maturity mismatches, covering tail risks, achieving more transparency about complex structured products and credit default swaps, and improving securities statistics. Adequate coverage of unregulated institutions and instruments (the "shadow banking system") also seems key to having a complete risk map. The crisis also clearly demonstrated the need to supplement averages with ranges and distributions.
2. Cross-border financial links should be covered more thoroughly within national financial stability assessments. In a world of fast-growing financial integration and the creation of large multinational financial groups, data on these networks are indispensable for a proper financial stability assessment within the country. Enhancing information on systemically important global financial institutions and initiating the collection of data on cross-border flows seem to have become a "must" nowadays for a proper assessment of the stability of national financial systems.
3. Better monitoring of the vulnerability of the domestic economy to shocks is needed. This requires the availability of data and a proper understanding of: (1) vulnerabilities, which can be assessed through sectoral balance sheets and flow of funds data; (2) markets from which boom-bust cycles might arise, such as real estate markets; (3) the linkages between the financial and real sectors within an economy; and (4) public finances.
This list of general recommendations in fact conceals the need for very large and demanding efforts in the area of statistics. Although there are many angles worth noting, I will stress those which I think give us "the big picture" of what should be done.
The first lesson of the crisis was that timely, more granular and more inclusive data are needed. And let me note here that monetary statistics is probably one of the most telling examples, where greater granularity and inclusiveness of data is the next inevitable developmental step. Aggregate monetary data should capture all sub-sectors of the financial sector, including insurance companies, pension funds, investment funds, other financial intermediaries and the shadow banking system. Also, for the purpose of tackling the issue of cross-border financial flows, consolidated data for banking and insurance groups seem inevitable. This requires a serious reshaping of monetary statistics. I think that the Eurosystem offers a good example of concrete steps being undertaken to translate the need for a granular and more comprehensive set of data into reality. First, the Eurosystem is developing a securities holdings statistics database. "Securities holdings represent a field where exposures are often concentrated, and a lack of sufficiently comprehensive, consistent and granular information has been identified" (Draghi 2012). Second, more detailed balance sheet information on all financial intermediaries has been developed. Third, a matrix of integrated sector accounts has been devised and is published on a quarterly basis. Fourth, quarterly information on cross-border holdings of financial assets and liabilities by euro area residents is compiled, allowing for the monitoring of cross-border exposures.
The second lesson of the crisis was that data production should be multipurpose. Granularity of the data in fact means making better use of the micro-foundations of macro statistics. Although the need for more granular data arose primarily from the requirements of macro-prudential oversight, such data carry great weight for monetary policy as well. As monetary policy makers we want to act proactively, and this requires identifying risks in advance, which very often means "going beyond the aggregates". Hence, for both monetary policy and financial stability purposes, a variety of granular data and indicators is needed, allowing data "outliers" and "tail risks" to be noticed. It is therefore only natural to seek harmonized data collection systems serving both central bank mandates, monetary policy and financial stability. Of course, this requires extraordinary coordination among the compilers of the data. It is easier when the central bank also serves as supervisor; it is probably more complicated when the central bank and the financial sector supervisor are separate institutions.
This last point leads us to another challenge which should not be underestimated: how to strike a balance between the need for more comprehensive and micro-oriented data systems and the burden on reporting agents. An improved statistical system is of course a public good, but at the same time its merits must be weighed against the costs, which mainly take the form of a constantly increasing reporting burden. Although some increase in the reporting burden is probably unavoidable, practice and recent experience show that there are efficient ways to handle it. Making better use of already existing data, streamlining the data and avoiding overlaps can mitigate the problem. An often-mentioned example is the work of the ECB and the European Banking Authority (EBA) to reconcile credit institutions' statistical and supervisory reporting requirements: in 2012 a classification system linking the requirements of the ECB's monetary and financial statistics with the supervisory reporting templates was published.
In this regard, the synchronization of data flows which until now have been separate, serving different mandates of the central bank, reveals another relevant issue that calls for a quick solution: data confidentiality. Individual data, particularly data for supervisory purposes, are treated with a large degree of confidentiality, and different national authorities have different legal frameworks addressing the confidentiality of individual data. However, uninterrupted exchange of data among different users must be enabled without jeopardizing the confidentiality principle. A good example in this respect is the EU Regulation on European Statistics of March 2009, where statistical confidentiality is treated in the following manner: "Transmission of confidential data between an ESS authority that collected the data and an ESCB member may take place provided that this transmission is necessary for the efficient development, production and dissemination of European statistics or for increasing the quality of European statistics, within the respective spheres of competence of the ESS and the ESCB, and that this necessity has been justified." Of course, this makes procedures for safeguarding data confidentiality all the more important.
The need for higher quality and additional dimensions does not apply only to the statistics I have touched upon above. The global crisis stressed the need to "reshape" economic data as well, as they too have come under greater public scrutiny: they are watched and used actively by policy makers, financial markets and the general public. The excessive deficit procedure and the European Commission's Macroeconomic Imbalance Procedure, used to detect and correct macroeconomic imbalances on the basis of a scoreboard of indicators, "are adding pressure on statisticians to deliver high quality data" (González-Páramo, 2012). This highlights the need for, and the importance of, excellent cooperation between central banks and national statistical offices.
In closing, I hope I have managed to pinpoint the main aspects of the ongoing and evolving discussion on the main challenges for statistics against the backdrop of the "Great Recession". The crisis was, hopefully, a once-in-a-century adverse event, but it has revolutionized the way statistics are thought of. Many of the issues and challenges in statistics that arose from the crisis have been tackled in the advanced world, but many remain to be addressed in the times to come. For less developed countries and less developed statistical systems, the challenges are twofold. First, even absent the crisis, there was a lot of work to be done to converge to the systems and practices of the developed world. Of course, much has been done and gaps have been narrowed. But the post-crisis changes in the approach to statistics in developed economies mean that their systems will gain an additional layer of quality. Hence, the crisis has made the challenges for us even bigger, as the benchmarks for convergence in the area of statistics are now much higher. But I am positive that the efforts and energy we invest will be enough to bring us closer to what is currently perceived as best practice in the world of statistics. In fact, there is no alternative. By now we all know that reliance on second-best solutions in statistics can entail wrong policy decisions, an inability to detect and respond to vulnerabilities, and large costs for society.
Let me wish you a successful seminar with productive and interactive discussions.