Slatest passes along a story, "Military Wasn't Told of Fort Hood Shooter's E-Mails," that appeared in today's Wall Street Journal as "Agencies See Gaps in Sharing."
The Slatest post is a great example of saying "OHMYGOD... actually, there might not be much of a story here"; the WSJ article is a little less so. Both make the story out to be "we spent millions after 9/11 to improve information sharing, but here is a clear case where information wasn't shared, just as in the old days," with the implication (though both articles admit this is not a sure thing) that events might have turned out differently had the information been shared.
Some sociology-of-information fundamentals are at work here. The WSJ article actually describes what sounds like a pretty thorough process for assessing whether or not to pass the information along. For what sound like good reasons, the decision was not to. Now, maybe the structure of that decision process needs revisiting (and maybe not, since no one appears to have shown that the information would have made a difference), but that is a different story from "agencies are not sharing information."
A Senate official is quoted as saying "[a]ll signs are indicating that something wasn't put together." But this may be misleading. The article also uses a favorite phrase from 2001, "connecting the dots," and speaks of "intelligence gaps." Both, I think, are uttered too glibly and unanalytically. Events like these bring out massive displays of "hindsight bias" -- the tendency to see things, after the fact, as far more predictable than they really were.
There is probably also a problem with the geometric metaphor of intelligence gaps. We might distinguish topological gaps (information in one place does not reach another place) from topographical gaps (the empty spots in the information any particular knower has access to). When an event like Fort Hood occurs, we start to fantasize about a world in which the gaps pointed to by hindsight would have been bridged. But to guarantee that, we would probably need to posit a world with no gaps at all -- a world in which everyone knows everything. Without a division of informational labor, the whole thing grinds to a halt.
We need to zero in on how humans get information to those for whom it is relevant. Competent nodes in an information network have good working models of the relevance systems of the nodes they are connected with. We don't want to eliminate intelligence gaps; we want to make the gaps (read: links) more intelligent. That kind of intelligence probably comes mostly from interaction, something organizations and agencies are not naturally prone to. What analysts should examine is what those millions of dollars were actually spent on in our quest to fix the intelligence gaps, rather than simply implying that the effort has been wasted.
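To make the distinction concrete, here is a toy sketch (my own illustration, not anything from the articles) of the two kinds of gap and of an "intelligent link." The node names, topics, and the `neighbor_model` idea are all hypothetical: each node keeps a working model of what its neighbors find relevant, and information travels only when the sender's model says the receiver cares.

```python
# Toy model: topological gaps are missing links between knowers;
# topographical gaps are blank spots in what any one knower holds.
# An "intelligent link" is a link whose sender has an accurate model
# of the receiver's relevance system.

class Node:
    def __init__(self, name, interests):
        self.name = name
        self.interests = set(interests)   # what this node actually cares about
        self.holdings = set()             # topographical profile: what it knows
        self.neighbor_model = {}          # neighbor -> topics we *believe* they care about

    def link(self, other, believed_interests):
        # A topological connection, plus a (possibly wrong) relevance model.
        self.neighbor_model[other] = set(believed_interests)

    def learn(self, topic):
        self.holdings.add(topic)
        self.share(topic)

    def share(self, topic):
        # Forward only where our model of the neighbor says it is relevant.
        for neighbor, believed in self.neighbor_model.items():
            if topic in believed and topic not in neighbor.holdings:
                neighbor.learn(topic)

# The link between the two agencies exists (no topological gap), but
# Agency A's model of B's relevance system omits "extremism" -- so the
# dot never travels, even though B would have cared.
a = Node("Agency A", interests={"signals"})
b = Node("Agency B", interests={"signals", "extremism"})
a.link(b, believed_interests={"signals"})   # a stale relevance model

a.learn("extremism")
print("extremism" in b.holdings)  # False: the link exists, but it isn't "intelligent"
```

Notice that adding more links or flooding every node with everything would "fix" this case only by abolishing the division of informational labor. The fix the sketch points to is narrower: updating Agency A's model of what Agency B finds relevant, which is exactly the kind of thing that comes from interaction.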