This post draws on a recent RFF discussion paper by RFF Senior Fellow Roger Cooke, where he explores these topics in greater detail. Cooke is the Chauncey Starr Chair in Risk Analysis at RFF and lead author for Risk and Uncertainty in the recently released IPCC Fifth Assessment Report.
The IPCC's Fourth Assessment Report projected sea level rise in 2100 of 18 to 59 cm, but excluded the contribution from ice sheets because the ice sheet models were not up to snuff. They still aren't, but researchers Bamber and Aspinall at the University of Bristol have found a work-around: structured expert judgment (SEJ).
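To give a flavor of how SEJ combines expert opinions, here is a minimal sketch of a performance-weighted linear pool in the spirit of Cooke's classical model. All numbers below are hypothetical illustrations, not Bamber and Aspinall's data; in a real study, the weights would come from each expert's calibration and information scores on seed questions.

```python
import random

# Hypothetical expert assessments (5%, 50%, 95% quantiles) for the
# ice-sheet contribution to 2100 sea level rise, in cm, with made-up
# performance weights (in Cooke's method these come from calibration
# against seed questions with known answers).
experts = {
    "A": {"q": (5.0, 15.0, 40.0), "w": 0.5},
    "B": {"q": (2.0, 10.0, 25.0), "w": 0.3},
    "C": {"q": (10.0, 30.0, 80.0), "w": 0.2},
}

def sample_expert(q5, q50, q95):
    """Sample from a simple piecewise-uniform distribution matching the quantiles."""
    u = random.random()
    if u < 0.5:
        # Interpolate within the 5%-50% band; clamp the lower tail to q5.
        return q5 + (u - 0.05) / 0.45 * (q50 - q5) if u > 0.05 else q5
    # Interpolate within the 50%-95% band; clamp the upper tail to q95.
    return q50 + (u - 0.5) / 0.45 * (q95 - q50) if u < 0.95 else q95

def sample_pool(experts):
    """Draw one sample from the performance-weighted mixture (linear pool)."""
    r, acc = random.random(), 0.0
    for e in experts.values():
        acc += e["w"]
        if r <= acc:
            return sample_expert(*e["q"])
    return sample_expert(*list(experts.values())[-1]["q"])

random.seed(0)
draws = sorted(sample_pool(experts) for _ in range(10_000))
print("pooled median (cm):", round(draws[5_000], 1))
print("pooled 95th percentile (cm):", round(draws[9_500], 1))
```

The pooled distribution is a weighted mixture of the experts' distributions, so its spread reflects both each expert's own uncertainty and the disagreement among them; that is what lets SEJ stand in for a missing process model.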
Over the past decade, the importance of incorporating uncertainty analysis into climate modeling has become generally recognized. The question is whether our current tools are up to the task. I've been looking at this question, and my paper on it, "Uncertainty analysis comes to integrated assessment models for climate change...and conversely," has just been published in Climatic Change (available as an open access publication). My conclusion is that we need a serious investment in a new generation of uncertainty quantification techniques to address the difficult issues that the climate problem raises. To date, developments in uncertainty quantification have largely been driven by large methodology investments in the nuclear sector, which produced three generations of techniques.
A truism in risk management is that every disaster was predicted by someone, sometime, somehow. The 6 April 2009 earthquake that devastated the Italian city of L'Aquila was “predicted” (actually retrodicted: evidence was adduced post hoc) by anomalous toad behavior 70 km away.