We are apparently in the minority in this view. We recently acquired a copy of an unclassified 1978 research paper from the Central Intelligence Agency’s National Foreign Assessment Center (republished by the University Press of the Pacific in 2005) which examined this very issue under the title “Relating Climate Change to its Effects”. For those younger analysts unfamiliar with the misty back history of old bureaucratic battles – and therefore with older acronyms – NFAC was the renamed Directorate of Intelligence (DI) under DCI Turner in 1977, a designation which lasted only until reorganization under DCI Casey’s tenure in 1981.
We think perhaps this product might have best been buried with the old name. As a paper, it is almost entirely uninspiring – a mere 8 pages of substantive text, followed by hundreds of pages of tables and statistics that are the hard copy rendition of a contemporaneous data tape, mostly consisting of temperature and precipitation measurements assembled by a university contractor on behalf of USDA. These form the inputs to a simple climate model intended to provide long-term predictions of weather effects given specific outcomes, such as global cooling – a key concern of the day. (To their credit, however, the designers examined the potential for global warming as well – which speaks well of the analytic rigour of the DI under any name, even if the paper is mute testament to just how bad scientific and technical experts can be at communicating with their readership through written intelligence products.) A speculative product such as this can be expected to offer no real conclusions, serving instead simply to mark a possible set of boundaries within the uncertainty space of future scenarios. It might, however, have made more explicit the effects it claimed to consider within the range of those scenarios.
But again, this serves to illustrate both the waste and the foolishness of attempting a futures intelligence estimate so far into the out-years. We are not issued crystal balls when we are granted entry into the profession. It also serves to illustrate the perils of the arbitrary application of quantitative analysis as a fig leaf over unsustainable judgments. The model – no doubt painstakingly assembled and hard fought at the methodological level – is by its very nature a product of 1970s-era computer science. In the face of Moore’s Law, it is therefore over 20 generations obsolete.
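The “over 20 generations” figure can be sanity-checked with simple arithmetic. A minimal sketch follows, assuming the conventional 18-month Moore’s Law doubling period and a circa-2008 writing date – both assumptions of ours, stated nowhere in the original paper:

```python
# Back-of-the-envelope check of the "over 20 generations obsolete" claim.
# Assumptions (not from the 1978 paper): an 18-month doubling period and
# a writing date of roughly 2008.
paper_year = 1978
writing_year = 2008
doubling_period_years = 1.5

generations = (writing_year - paper_year) / doubling_period_years
print(generations)  # 20.0
```

Thirty years at one doubling every eighteen months yields twenty generations of hardware, consistent with the claim above.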
Let us hope that this little musing upon the history of the account gives at least slight pause to modern practitioners seeking to enshrine climate change as a permanent account for long range intelligence analysis. At a point in time when supercomputational problems that take longer than a few months’ run time simply are not run at all until the next generation of architecture delivers its inevitable order-of-magnitude advance, it is after all more than a bit presumptuous to assume that any community entity could beat or even match the kind of big iron thrown at these problems in the civilian science world. It also very much raises the question of what better use such resources might be put to for other intelligence accounts – perhaps in the classic roles in which the IC has always employed supercomputing resources: cryptanalysis, automated signal processing, or even exploring the new boundaries of potential offered by quantum intelligence.
If nothing else, this bit of history has also more firmly reinforced our opinion that climate change issues are a matter best left to the academics. Perhaps once the Long War has been won – and given the timescale we believe will be needed to accomplish this monumental, generational task – the community’s attention can then turn once more to the matter. And if the current crop of speculative forecasts proves correct, at that point in time the issue may properly fall within the window of an actionable long-range estimate.