Who believes risk can be measured?
October 8, 2022
I have long argued that risk, especially extreme risk, cannot be measured. A lot of people disagree. Who are they?
I have been arguing for two decades, ever since my all-time favourite paper, The Emperor has no clothes: Limits to risk management, that day-to-day risk is, at best, reasonably well measured, and that extreme risk is non-measurable. I have echoed that in many blogs on these pages and now in my new book, "The Illusion of Control", most recently in a blog titled Why the risk you measure is probably not the risk you care about.
The typical reaction is: "of course, everybody knows that".
Usually, but not always. And because that is rather important for understanding why we rely so much on statistical risk measurements in internal risk management and financial regulations, I wanted to understand those who disagree.
So I have identified three categories of individuals who seemingly have no issues with current practices in risk management and financial regulations.
The first is the physicists. They are trained to believe the problem of understanding the universe is simply one of measurement, and they benefit from massive datasets. If the data is insufficient, build a better laser interferometer gravitational-wave observatory (LIGO), and we will know the universe better. Unfortunately, the signal from the LIGOs is very noisy, with notorious glitches arising for unknown reasons. The LIGO teams tend to omit these glitches from the data, thereby missing out on an intriguing black hole event. In that respect, the LIGO data resembles markets.
When physicists are put to work in financial institutions, they conclude that if measured risk is inaccurate, it is because we don't have enough data, the computers are too slow, or the models are too simple. The solution to risk measurement is more data, more complexity and faster computers.
Nature doesn't care what you are up to, and the glitches in LIGO data (hopefully) don't happen because the universe is out to get you. It is not so in finance. Almost everybody is trying to pull a fast one on you. Those who aren't have an incentive to maximise profit subject to risk estimates, which guarantees that, in aggregate, they act as if they are. Some opponents are brilliant and highly resourced.
Meanwhile, measuring the financial system changes it — Goodhart's law — so that no matter how much data one collects, how powerful the computers are, or how sophisticated the math is, one can never capture extreme risk.
The financial system is endogenously complex, unlike the physical world, which is exogenously complex.
And finally, physicists measure reality as they find it. Risk management involves predicting the future, which requires assumptions, often heroic ones. Meanwhile, unlike in physics, it is generally impossible to verify the accuracy of those predictions ex-post.
Simply put, the problem of quantifying the financial system is harder than measuring the physical world. But that can be a hard pill to swallow for those taught to believe physics is the top dog.
The second category is the lawyers, at the opposite end of the spectrum. They defer to expert evidence and expert opinion when establishing facts and making predictions. For risk, they defer to what the specified measure says, no more and no less. If risk has been defined in the rulebook as value-at-risk, that becomes the tool through which risk is assessed. There is nothing more to be said at a positive level. Law is about giving meaning to statutes and ensuring compliance.
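To make concrete what such a rulebook definition means in practice, here is a minimal sketch of one common way value-at-risk is estimated, historical simulation over a window of past returns. The function name, the simulated return series and the 99% confidence level are illustrative assumptions, not anything specified in the rulebook or in this post.

    import numpy as np

    # A minimal sketch of historical-simulation value-at-risk.
    # The return series and the 99% level are illustrative assumptions.
    def historical_var(returns, level=0.99):
        # VaR is the loss threshold expected to be exceeded
        # on roughly (1 - level) of days.
        losses = -np.asarray(returns)
        return np.quantile(losses, level)

    rng = np.random.default_rng(0)
    simulated_returns = rng.normal(0.0, 0.01, 1000)  # hypothetical daily returns
    print(f"99% VaR: {historical_var(simulated_returns):.4f}")

The output is a single quantile of past data; nothing in the calculation speaks to what happens beyond that quantile, or to how the system reacts once the number is used to control it.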
At a normative level, lawyers assume other disciplines deliver adequate tools. Stated differently, without appreciating the complex mathematics of risk, it is easy to say, "I trust the risk managers. They have a PhD in physics from the best university in the world. So who am I to question how they do their job? Just make risk a legal concept."
The third, and largest, category of those who are fine with existing risk technology is those who don't care, so long as they can do their job, collect bonuses and get promoted. For them, the nature of risk is not as clearly defined as it is for the lawyers. However, it is still a concept dominated by procedure and established practice rather than broader, more subjective and more holistic concerns. Issues that don't easily fit within the bureaucratic requirements of their position are simply dismissed.
Suppose a member of one of those groups comes back to me with, "Oh no, we aren't", perhaps adding, "My colleagues follow the highest ethical and professional standards". What that misses is Goodhart's Law (again), which says that the very measurement device we are using will be made obsolete simply by using it to control the system, because the system changes when we try to control it.
And since the rulebook, by its very nature, does not allow frequent changes in technology (often less than one update a decade), the risk measurements, even of day-to-day risks, are almost guaranteed to be biased in ways which increase instability and afford less protection than intended. If you are looking for an example, I wrote a paper a few years ago on how Value-at-Risk, as implemented in Basel II, created incentives for excessive risk-taking.
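As a toy illustration of why a rule expressed only in terms of value-at-risk can reward tail risk (my own sketch, not the model in that paper), consider two positions with similar 99% VaR but very different losses beyond the quantile; a VaR constraint cannot tell them apart. All numbers below are made up for illustration.

    import numpy as np

    # Toy illustration, not the cited paper's model: two positions with
    # similar 99% VaR but very different losses beyond that quantile.
    rng = np.random.default_rng(1)
    n = 100_000

    # Position A: thin-tailed losses.
    loss_a = rng.normal(0.0, 1.0, n)

    # Position B: usually benign, but with a small chance of a huge loss.
    loss_b = np.where(rng.uniform(size=n) < 0.005, 40.0, rng.normal(0.0, 1.0, n))

    for name, loss in [("A", loss_a), ("B", loss_b)]:
        var99 = np.quantile(loss, 0.99)
        beyond = loss[loss > var99].mean()  # average loss beyond the 99% VaR
        print(f"{name}: 99% VaR = {var99:.2f}, mean loss beyond VaR = {beyond:.2f}")

Under a constraint that only looks at VaR, position B appears almost as safe as position A, even though its losses beyond the threshold are far larger.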
If you disagree, do get in touch.
p.s.
Bank of England deputy governor Jon Cunliffe's letter to the Treasury Select Committee, 6 October 2022.