Analytical problem definition
We have been greatly surprised to observe so many new analysts, trained in the methodologies of structured analysis, struggle to choose among multiple techniques when presented with a new problem (even in target accounts with long established precedent, if only they had done their research “homework”). The trend has been no less ominous among many of the academic intelligence instructors we have witnessed, who may emphasize specific analytic methodologies to the exclusion of all others – even for problems where the favoured technique would be less effective than the alternatives, or perhaps entirely ill-suited. This worrying issue, however, is usually not present at the larger service schools or major community institutions, although both the outside “civilian” academics and the community schoolhouses make use of essentially the same literature in much the same way.
In large part we have begun to consider that many of these new analysts (and the academic instructor staff which guides them) may simply be failing to adequately structure and define analytical problems in the context of the larger account. It is not an entirely easy task, and one on which most tradecraft classes spend a good measure of time. There are many means by which an analyst may refine a question or question set, all widely available in the open literature. If given the opportunity to interact with potential consumers (or more rarely, the supported policymakers themselves – a subtle distinction in an age where flourishing bureaucracy continually adds intervening layers), there is a wide range of techniques to ensure that the problem as defined and presented will actually serve decision maker expectations. (Many of these have been borrowed wholesale from the consulting world, in another example of the positive cycle of influences from privatized intelligence providers.)
We had initially thought that this neglect of analysis techniques other than the usual ACH (or much more rarely, linchpin or Bayesian methods) stemmed from a lack of familiarity with other options – like the man whose only tool is a hammer, so to speak. But we are beginning to believe that many of the individuals we have observed displaying this excessive (one might even say obsessively dogmatic) emphasis on single-technique solutions may simply be seeing every problem as a nail despite an otherwise full toolbox – and insisting on making a racket until others agree or walk away.
We have witnessed no more clearly drawn example of this dynamic than a recent academic effort to create an unclassified, open source futures study of a transnational intelligence issue with a 15-year estimative horizon. The effort began well enough, but somewhere in the necessary negotiations regarding production management and workflow assignment, a most unusual emphasis on certainty began to creep (and then surge) into the requirements set.
Now, as any analyst should know, we are not issued crystal balls when we are given our badges – it is not given to us to see the future. We may merely sketch its influences and chart its drivers, and by virtue of reasoning and critical thinking draw conclusions regarding the interactions of those elements. The further out we would cast our gaze, the less certainty any analyst can lay claim to. Yet this study team, in an effort to keep their matrices tidy and their predictions free of any ambiguous language at all, ignored this maxim, producing a piece in which sourcing concerns overrode any forward looking projections and in which anything but unassailably established fact was simply ignored. In short, it became not an estimate at all, but rather descriptive “intelligence” summarizing a global issue in a neatly ranked format any Taylorite industrial age central planner could love.
This clearly drew the wrong lesson from the debate and criticism surrounding recent estimative processes, especially regarding the communication of analytical confidence. But we believe this resulted largely from the group's exposure to only a small number of data points regarding evolving community practices (whether or not more material may have been available to them – and we are certain it was), which led them to re-make core tradecraft in an idealized image observed through an imperfect mirror. Had the group spent less effort congratulating themselves on the “newness” of their approach, and arguing for measures which substituted scope and complexity for rigour, they might perhaps have focused on core problem definition and tested a wider range of existing analytic tradecraft before embarking on a course which produced not only marginal results, but an entirely unrecognizable pathway.
Lest one think we are too cruel in judging this effort, let us hasten to add that one expects such occasional stumbling from journeymen analysts. Mistakes are the best means by which lessons are learned, especially when one has the luxury of making them in the schoolhouse before doing so in a context which might cost lives. (This is after all the enduring principle of Red Flag, which applies equally to intellectual as well as operational assignments.) But we cite this small, no doubt soon to be forgotten effort not as a judgment on individuals but rather as an example of a systemic breakdown in the means by which the craft is being transmitted. While one example does not a trend make, there are dozens more similar stories (which vary in detail, but not in kind) that could be told. And it appears to be a far more subtle failure than we might otherwise have thought, capable of occurring despite the best available literature – but with the wrong sort of emphasis.
Perhaps this is symptomatic of the failures of strategic perspective that also plague many favoured institutions, or perhaps it is among the things that the community has failed to adequately inculcate in those now engaged in spreading its doctrines and craft.
(Admittedly, it is hard to convey a robust sampling of the community’s knowledge through instructors who spent only a few short years as practitioners, if that.) If nothing else, this seems to indicate a need for a return to first principles, and a deep and reflective examination before Smoking Mirror.
Labels: analytic tradecraft, strategic thinking, teaching intelligence