Wednesday 30 March 2011

Edge: WHEN WE CANNOT PREDICT - An EDGE Special Event

DOUGLAS RUSHKOFF
Media Analyst; Documentary Writer; Author, Program or Be Programmed
It's too easy to discount intuition. Pattern recognition. The almost literary sensibility through which we make sense of our world.
The narrative implicit in the nuclear plant disaster in Japan is just too striking for most humans to ignore: the nation that suffered an atomic bombing is now enduring a nuclear crisis. A particular kind of scientific orthodoxy refuses to even entertain such parallels except as evidence of psychological or cultural biases clouding what should be our reliance on the data.
But, as black swan events like this prove, our reliance on the data continually fails us. We just can't get enough data about our decidedly non-linear world to make accurate predictions. There are too many remote, high-leverage points in the chaotic systems constructing our reality for us to take all of them into account. Things that seemed not to matter — or that we didn't even notice — iterate until they end up mattering a lot. We're better off looking at a fractal and intuiting its relevant patterns than relying on its various pieces to tell us its unfolding story. Science too often divides to understand, incapable of even acknowledging there might be a science in divining to the same ends.
The coincidence of the nuclear crisis in Japan, combined with our inability to predict the events that precipitated it, forces another kind of predictive apparatus into play. No, it's not one we like to engage — particularly in rational circles — but one we repress at our own peril. Science is free to promote humanity's liberation from superstition or even God, but not from humanity itself. We still have something in common with all those animals who somehow, seemingly magically, know when an earthquake or tsunami is coming and move to higher ground.
And our access to that long lost sense lies in something closer to story than metrics. A winter bookended by BP's underwater gusher and Japan's radioactive groundwater may be trying to speak to us in ways we are still human enough to hear.

RODNEY BROOKS
Panasonic Professor of Robotics (emeritus) at MIT, and Director, MIT Computer Science and Artificial Intelligence Lab; Founder, iRobot; Author, Flesh and Machines: How Robots Will Change Us
The avian flu outbreaks in 2007, the spike in oil prices in the summer of 2008, the financial system meltdown later that year, the volcanic eruptions of 2009, the oil spill in 2010, and now the earthquake/tsunami/nuclear emergency of 2011, let alone the political upheavals in the Middle East, all serve to remind us that we are very poor at making predictions. And we are apparently very poor at assessing risks. But perhaps we can make some observations.
All of these events, one way or another, have shown us how fragile our global supply chain and transportation network really are. There are already reports of how the recent earthquake, never mind the nuclear issues, has slowed down repairs to New York City subway stations, halted production of some GM automobiles in the US (an industry still recovering from the financial meltdown), and even rippled into my own hobby project of building a credible 19th century digital computer.
Our quest to squeeze every efficiency out of our systems of production and supply has led us to fragility rather than robustness. We have gained short-term margins at the cost of long-term stability. Our quest for every last basis point in our financial results has led us to build a system with countless single points of failure. We are vulnerable to natural disasters, unforeseen economic disasters, and clever exploiters of our systems, such as governments cornering rare-earth metal supply chains, or plain opportunistic hedge traders in rather conventional metals (which is why all nickel-based batteries rocketed in price three years ago).
The drumbeat of continued unexpected failures of nature, technology, or economics will not go away. Perhaps, however, we can take lessons from the disruptions they cause, and find a way to monetize stability over maximum possible short-term efficiencies, so that our constructed civilization will be more resilient to these events.

J. DOYNE FARMER
Chaos Theory Pioneer; McKinsey Professor, Santa Fe Institute; Co-Founder and former Co-President of The Prediction Company
Viewing the Nuclear Accident in Japan Through the Lens of Systemic Risk
Predicting risk might sound like an oxymoron, but it isn't: we do it every day. Everyone knows, for example, that the risk of a dangerous fall on a steep mountain trail is higher than it is on level ground. Prediction of risks is more difficult, however, when they are systemic. Systemic risks occur when individual components of a system interact and collectively generate new modes of behavior that would never occur for a single component in isolation, amplifying existing risks or generating new ones.
The recent financial crisis provides a good example. Banks normally manage risk under the assumption that the financial system will behave in the future more or less as it has in the past. Such estimates are based on historical losses. This is fine under normal circumstances. But in the recent financial crisis a small drop in housing prices triggered a chain reaction that suddenly made the financial system behave completely differently, and extrapolations of risk based on historical losses became irrelevant.
Systemic risks are hard to predict. They are inherently complex phenomena, typically involving nonlinear feedback that couples together the behavior of many individual components. Systemic risks frequently occur in systems where there are neither good models nor good measurements, where theory or simulation is impossible. They often involve modes of interaction that have not been seen before, making past experience of little value. The amplitude of the resulting problem is often far larger than previously imagined possible.
How can we anticipate and minimize systemic risk? The key general principle is stability. Systemic risks occur when bad behaviors feed back on one another, so that small problems are amplified into big problems. When things go topsy-turvy, do the problem behaviors damp out, or are they amplified? In the recent financial crisis, for example, the key problem was leverage, which amplifies both gains and losses. Leverage is good during good times, but during bad times it makes the financial system unstable.
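As a rough illustration of the amplification leverage produces, here is a minimal Python sketch with hypothetical numbers (it is not drawn from Farmer's essay, and it ignores funding costs and margin calls):

# Toy sketch: leverage multiplies returns on equity, up and down.
# "Leverage" here means assets divided by equity; all figures are hypothetical.
def equity_return(asset_return, leverage):
    """Return on equity when assets are `leverage` times equity."""
    return leverage * asset_return

for asset_return in (0.05, -0.05, -0.10):
    for leverage in (1, 5, 10):
        print(f"asset return {asset_return:+.0%}, leverage {leverage:2d}x -> "
              f"equity return {equity_return(asset_return, leverage):+.0%}")

A 10% fall in asset values at 10x leverage wipes out the equity entirely, which is exactly the kind of amplification that turns a small shock into a systemic one.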
The recent Japanese earthquake/tsunami provides another example of how a normal risk can turn into a systemic risk. For Japan, given the history of the region, an earthquake in tandem with a tsunami might be called a normal risk. But no one realized in advance that a tsunami could destroy both the main power and the backup power of a nuclear power plant, while an earthquake could also create cracks causing a loss of coolant. The resulting nuclear catastrophe came on top of all the other damage to the infrastructure, making the nuclear crisis even harder to solve than it would have been otherwise, and the radiation leakage has made it even harder to get the infrastructure functioning again. The risks of both have been amplified.
With hindsight the consequences of a large earthquake and tsunami seem obvious, so why didn't the engineers plan for them properly? This is the usual story with systemic risk: In hindsight the problems are obvious, but somehow no one thinks them through beforehand.
As already explained, from a complex systems engineering perspective, the key principle is stability. Nuclear power generation is intrinsically unstable. If you walk away from a wind generator or a solar cell when a crisis occurs, not much happens. If you walk away from a nuclear reactor under the wrong circumstances, it can melt down. To cope with the systemic risk one needs to think through all possible scenarios. The experts might be able to plan for all the known failure modes, but it is much harder to anticipate the unknown ones.
The prognosis for nuclear accidents based on simple historical extrapolation is disturbing. After roughly 14,000 cumulative years of nuclear plant operation, we have now had three major accidents. If we ramp up nuclear power by a factor of ten, which is necessary to make a significant contribution to mitigate global warming, we will increase from the 442 reactors that we currently have to about 5000. Historical extrapolation predicts that we should then expect an accident of the magnitude of the current Japan disaster about once a year.
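The arithmetic behind that extrapolation is simple enough to spell out; here is a back-of-the-envelope sketch in Python, using only the figures quoted above:

# Back-of-the-envelope extrapolation of the historical accident rate,
# using the figures quoted in the essay.
accidents = 3                # major accidents to date
reactor_years = 14_000       # cumulative years of nuclear plant operation
scaled_reactors = 5_000      # roughly a tenfold ramp-up from today's 442

rate = accidents / reactor_years            # about 2.1e-4 per reactor-year
expected_per_year = rate * scaled_reactors  # about 1.07

print(f"historical rate: {rate:.2e} major accidents per reactor-year")
print(f"expected at {scaled_reactors} reactors: {expected_per_year:.2f} per year")

The result, a bit more than one major accident per year, is the "about once a year" figure above.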
But I don't trust the historical method of estimating. Three events are unlikely to properly characterize the tails of the distribution. My personal choice for a really nasty nuclear scenario goes as follows: Assume the developed world decides to ramp up nuclear power. The developing world will then demand energy independence and follow suit. For independence you need both reactors and fuel concentrators. There will be a lot of debate, but in the end the countries with stable governments will get them. With a fuel concentrator the waste products of the reactor can be used to make weapons-grade fuel, and from there making a bomb is fairly easy. Thus, if we go down the path of nuclear expansion, we should probably assume that every country in the world will eventually have the bomb. The Chernobyl disaster killed on the order of ten thousand people: a nuclear explosion could easily kill a million. So all it will take is for a "stable government" to be taken over by the wrong dictator, and we could have a nuclear disaster.
I'm not an actuary, so you shouldn't trust my estimates. To bring the actuaries into the picture, anyone who seriously advocates nuclear power should lobby to repeal the Price-Anderson Act, which requires U.S. taxpayers to shoulder the costs of a really serious accident. The fact that the industry demanded such an act suggests that they do not have confidence in their own product. If the act were repealed, we would have an idea what nuclear power really costs. As it stands, all we know is that the quoted costs are much too low.
Danger is not the only property that makes nuclear power exceptional. Even neglecting the boost in cost that would be caused by repeal of the Price-Anderson Act, the cost curve for nuclear power is remarkable. My group at the Santa Fe Institute has collected data on the cost and production of more than 100 technologies as a function of time. In contrast to all other technologies, the cost of nuclear power has remained roughly constant for 50 years, despite heavy subsidies. This cannot be blamed entirely on the cost of safety and regulation, and after Japan, is anyone really willing to say we shouldn't pay for safety? In contrast, during the same period the cost of solar power has dropped by a factor of roughly a hundred, making it now roughly equal to nuclear. Wind power is now significantly cheaper than nuclear. Solar will almost certainly be significantly cheaper than nuclear within a decade, roughly the time it takes to build a nuclear plant.
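To put that hundredfold drop in perspective, here is a quick sketch of the implied average rate of decline (the 50-year span and factor of one hundred are the ballpark figures from the paragraph above, not precise data):

# Average annual cost decline implied by a hundredfold drop over roughly 50 years.
years = 50
cost_ratio = 1 / 100   # final cost relative to initial cost

annual_decline = 1 - cost_ratio ** (1 / years)
print(f"implied average decline: {annual_decline:.1%} per year")   # about 8.8%

A technology whose cost falls roughly 9% per year, compounding, will eventually overtake one whose cost stays flat, which is the crossover the essay anticipates for solar versus nuclear.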
To properly assess systemic risks, the devil is in the details. We can't debate the risks of a technology without wading through them. But from a complex systems engineering point of view, one should beware of anything that amplifies risk. Systemic risks are difficult to predict, and the precautionary principle dictates that one should take care when faced with uncertainty.

Source: Edge.org
