
07 February 2008

Revisiting analytic rigour

The research currently being done at Ohio State University into the problems of intelligence analysis – including information overload, cognitive processes, and other aspects of the methodology – has from time to time caught our interest. Among the more interesting of these items now in circulation is an excellent lecture, recorded last year during the too often overlooked Google Talks series, which we highly recommend to our readers. The discussion focuses on the evaluation of analytic rigour, and the means by which analysis may be strengthened.

We particularly favour the philosophy that Dr. Woods presents, which seeks to avoid dictating a single best methodology or process. We are more than willing to listen to the methodologists, but too often we find a dictatorial approach significantly at odds with the realities of line analysis. We think that the observed case study technique used in the Ohio State team’s research – something too infrequently done by many academics – is key to the validity of their findings. One cannot discuss analytic ideals without engaging those who actually apply tradecraft to real problems. Nor is it enough to conduct such research in artificial environments among student populations – real line analysis is simply too different.

We certainly cannot agree with the apparent off-hand condemnation of the “folk” psychology of intelligence analysis – clearly aimed at taking on Heuer’s “bible”. While we think that there is a clear role for the methodologists and their research into strengthening analytic tradecraft, there is also a very real need for interdisciplinary adaptation from other areas of social science, as well as for the kind of internal discussions that make up a key part of the maintenance of those oft-criticized, but entirely vital, guilds that are the backbone of the community.

We do, however, find several key concepts of great interest that deserve wider attention, including the concept of the Supervisor’s Dilemma – the balancing of customers’ outcome requirements and analytic resource opportunity costs against the relative depth of analytic rigour. We also find the study techniques themselves of interest, especially the concept of elicitation through critique – something we feel will likely have far greater applicability in capturing the kind of intergenerational knowledge that the community is in danger of losing. We see the technique as one means of making more formal – and scalable – some of the kinds of subtle interactions that characterized the experiences of apprentice and journeyman analysts under the mentorship of a master.

We also find great merit in the good professor’s comments regarding the overconfidence of new analysts, and the satisficing biases that result. We have certainly observed a level of arrogance in too many new hires – especially those coming out of the intelligence studies programs. The first lesson an analyst student should learn is the fear of God – and of their own error. Too many programs of instruction do not afford the student the chance to learn that fear through the visceral experience of their own mistakes, and to take away from that experience a humility that will lead them to productively question their future work and improve it. Such experiences are far better gained when the consequences are not fatal, in line with the lessons taught by Red Flag.

There is much food for thought in this lecture, as well as in the contributions to the literature that the Ohio State program has generated. We will no doubt have further commentary on the subject in the near future.
