Big Banks' Risk Does Not Compute

A Bloomberg piece that cites our work: Big Banks’ Risk Does Not Compute.

No matter how fine-tuned our regulations, no matter how sophisticated our risk management, they cannot properly address the hazards that the financial system in its current form presents.

The piece references our paper “Why risk is hard to measure”; the VoxEU version is here and the working paper is here.

Mathematicians Jon Danielsson and Chen Zhou have examined how much data would be required to get reliable estimates of either value-at-risk or expected shortfall, even in a world where the future is like the past. Suppose you wanted a reasonably accurate reading of expected shortfall – say, an estimate likely to fall within 5 percent of actual losses. For the complex portfolios of large financial institutions, this would require decades of price history on hundreds or thousands of different assets – something that simply doesn’t exist for many of those assets (many firms don’t even stay in business that long, for example). With less data the result would be illusory, offering no meaningful sense of the risk present at all.
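The point can be illustrated with a small Monte Carlo experiment. The sketch below is not the authors’ method, just a minimal simulation under an assumed heavy-tailed return distribution (Student-t with 4 degrees of freedom): it estimates 97.5% expected shortfall from samples of different sizes and shows how noisy the estimate remains even with thousands of observations.

```python
import numpy as np

def expected_shortfall(returns, alpha=0.975):
    """Empirical expected shortfall: mean loss beyond the alpha-quantile."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)          # value-at-risk at level alpha
    return losses[losses >= var].mean()       # average of the tail losses

rng = np.random.default_rng(0)

# Proxy for the "true" ES, from a very large sample of the assumed distribution.
es_true = expected_shortfall(rng.standard_t(df=4, size=2_000_000))

# Relative sampling error of the ES estimate at several sample sizes.
errors = {}
for n in (250, 2_500, 25_000):                # ~1, 10, 100 years of daily data
    estimates = [expected_shortfall(rng.standard_t(df=4, size=n))
                 for _ in range(500)]
    errors[n] = np.std(estimates) / es_true
    print(f"n={n:>6}: relative std. error of ES estimate = {errors[n]:.1%}")
```

With roughly a year of daily data (n=250), the tail estimate rests on only a handful of observations, so the error is large; it shrinks slowly as the sample grows, which is why decades of history would be needed for the accuracy discussed above.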