Written by Richard Olsen
Zug, Switzerland, December 21st, 2022

Economic models are the tools of decision-makers in governments and central banks. As such, they affect our everyday lives. But what is the actual track record of these models, and do they really work? Applying them in real time and validating them on a day-by-day basis leads to a clear conclusion: they are flawed.
The economic system is based on practices and business processes established at a time that predates modern technology. Governments and regulators attempt to manage the many challenges of the economy by thinking up new policies and other measures that they then “bolt onto” this struggling banking and financial system. However, these measures are based on insights derived from outdated economic models, formulated by economists who addressed only the challenges of their respective eras. Rarely does anyone ask whether these models provide the right advice and whether they really work. My hands-on experience tells me that they do not; or at least that the models are applicable only in very select circumstances and are not appropriate for across-the-board policy decisions.
A real-time predictive information service
In 1985, I founded Olsen & Associates as a research institute for applied economics with the goal of building a real-time information system using ‘Big Data’ to provide banks and treasuries with decision-making support. The nascent computer technology of that time made it possible to feed economic models with tick-by-tick market data to deliver real-time forecasts projecting minutes, hours, days, weeks, and even months ahead, along with risk analytics and trading recommendations through the use of computerized bots. We aimed to improve decision-making quality for traders and managers in banks, asset management firms, and corporate entities. We were inspired by the advances in the natural sciences and, in particular, new developments in meteorology. Accordingly, we had high expectations that a new generation of economic models could help such decision-makers optimize their efforts and enhance their risk-adjusted performance.
Our team of physicists and computer scientists began their work on developing a real-time information system. However, we did not anticipate the many obstacles we would encounter. For the first two or three years, we always talked about our “three-month” project, not fully appreciating the scale of our task.
Eventually, after many setbacks, we managed to get our real-time information system off the ground, and we signed up some major banks in Switzerland and Europe. At the time, leased lines were a necessity, as the Internet was not yet publicly available, so there was no possibility of providing such a service to retail investors.
The first high-frequency finance conference
Over the years, we published many scientific papers reporting our hands-on experiences and new discoveries regarding the statistical properties of financial markets. In 1995, we organized the first high-frequency finance conference. The event was attended by over 200 scientists from around the world and was a tremendous success. We showcased a data sample containing over 110 million market quotes. This sample, much bigger than any that researchers had previously been able to access, was a wake-up call for the community regarding the challenges of analyzing large data sets and the opportunities that high-frequency data offered in terms of validating economic models and analyzing their performance on tick-by-tick data. We also demonstrated how this enabled us to sidestep the problem of insufficient data, which is so prevalent in economics.
The conference rallied the community; the participants were full of hope that economics and finance would soon be transformed into a hard science built upon a solid foundation. The hope was that, by deploying economic models in real-time and checking their performance against actual events on an ongoing basis, we could help researchers identify shortcomings and eventually empower them to improve the quality of our economic models.
In the early days of meteorology, the forecasts were something of a laughing stock. However, over time, the models evolved, improved, and became more powerful and effective. Our hope was to achieve the same with financial forecasting through the creation of strong predictive models and decision-support tools.
Unsatisfactory performance of economic models
Sadly, these hopes were not realized.
In 2001, we published the book ‘An Introduction to High-Frequency Finance’ by M. Dacorogna, R. Gençay, U. Müller, R. Olsen, and O. Pictet with Academic Press. The book assembled under one cover all the research that we had conducted and published over the previous 15 years, spanning a broad range of journals, many of them not easily accessible. It provided an in-depth account of tick-by-tick market data, its statistical properties, and how to build real-time forecasting and trading models. The book became a standard reference manual for the rapidly growing quantitative hedge fund communities that were developing algorithms to trade the markets.
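One statistical property of tick data documented in that line of research is the empirical scaling law: the mean absolute price change grows as a power of the time interval over which it is measured. As a minimal, illustrative sketch (not the book's actual code), the exponent can be estimated from any price series by regressing log mean absolute return on log interval; a pure random walk yields an exponent near 0.5, whereas the cited FX studies reported values closer to 0.58. The synthetic data below is an assumption for demonstration only.

```python
import numpy as np

def scaling_exponent(prices, intervals):
    """Estimate D in the scaling law  E[|r_dt|] ~ dt^D.

    For each aggregation interval dt, compute the mean absolute
    log-return, then fit a line to log(mean |r|) vs. log(dt).
    """
    log_p = np.log(np.asarray(prices))
    mean_abs = [np.abs(log_p[dt:] - log_p[:-dt]).mean() for dt in intervals]
    slope, _intercept = np.polyfit(np.log(intervals), np.log(mean_abs), 1)
    return slope

# Synthetic Gaussian random walk: the estimate should land near D = 0.5.
rng = np.random.default_rng(0)
prices = np.exp(np.cumsum(rng.normal(0.0, 1e-4, 500_000)))
intervals = [1, 2, 4, 8, 16, 32, 64, 128]
D = scaling_exponent(prices, intervals)
print(f"estimated scaling exponent D = {D:.3f}")
```

Running the same estimator on real tick data, rather than a simulated walk, is what revealed the deviation from 0.5 that made high-frequency data scientifically interesting.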
The model approaches described in the book were certainly more sophisticated than the standard Excel spreadsheet-based models that used daily data. However, the performance improvement was only 20% to 30%. This was a far cry from the major breakthrough we had hoped for, and it certainly failed to reflect the successful breakthroughs occurring in other technologies, which brought improvements of several orders of magnitude.
Classical economics offered a potential explanation based on the concept of the ‘efficiency’ of financial markets. This is the term economists use for the notion that asset prices already reflect all available information, so that it is not possible to generate excess returns. This could explain the disappointing results of any attempt to build better economic models.
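The efficiency claim has a testable consequence: if prices already reflect all information, successive returns should be serially uncorrelated, leaving no exploitable pattern for a forecasting model. A minimal sketch of that test, run here on synthetic white-noise returns as a stand-in for market data (an assumption for illustration, not a result from the book):

```python
import numpy as np

def lag1_autocorr(returns):
    """Lag-1 autocorrelation of a return series."""
    r = returns - returns.mean()
    return float((r[:-1] * r[1:]).sum() / (r * r).sum())

rng = np.random.default_rng(1)
# Under the efficient-market null, returns behave like white noise,
# so the lag-1 autocorrelation should be statistically near zero.
returns = rng.normal(0.0, 1e-3, 100_000)
rho = lag1_autocorr(returns)
# A rough 95% band for white noise is +/- 1.96 / sqrt(n).
band = 1.96 / np.sqrt(len(returns))
print(f"lag-1 autocorrelation: {rho:.4f} (95% band +/- {band:.4f})")
```

On real tick data, small but measurable deviations from zero are exactly what the modest 20-30% model improvements mentioned above were able to exploit.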
Over the years, we systematically explored literally every approach we could think of, from traditional fundamental models to time series models and technical analysis. You name it, we tried it. Nevertheless, the performance of all the resulting models was poor; the best worked only ‘a bit’. It was possible to achieve positive results, but only barely so. Detailed performance reports are included in the book that we published.
Today, we know that existing economic models need “hand-holding” and are not ready for 24/7 deployment. If we do run them 24/7, their performance is erratic.
In the natural sciences, models operate around the clock. How would it otherwise be possible to use airplanes, cars, and computers? However, the economic models used by policymakers do not work when deployed in real time and cannot explain market movements over horizons ranging from minutes to hours, days, weeks, and months. They are flawed. Thus, we have to go back to the drawing board and our knowledge of first principles.
In the next article, I will reveal our key discovery and how we can build a new economic theory.
Richard Olsen is CEO & Founder of Lykke Corp