Tuesday, December 28, 2010

Outflanking "the Human" with Information

Two stories in NYT today about data crunching. One on mapping neuron connections in mice to understand how brains work. The other on using statistics to detect possible cheating on standardized tests.

The brain research takes thin slices of brain tissue and maps connections between neurons in a really BIG (petabyte per cubic millimeter) data mining operation. The research is in its infancy, but eventually one can imagine having a full circuit diagram of a brain. Interesting implications that may or may not have been grasped by the researchers or the article's author:
Neuroscientists say that a connectome could give them myriad insights about the brain’s function and prove particularly useful in the exploration of mental illness. For the first time, researchers and doctors might be able to determine how someone was wired — quite literally — and compare that picture with “regular” brains.
Experts quoted in the article debate whether the research is promising enough to spend millions on.  But this comment about defining normal or regular brains is not one of the concerns they mention.  What are the informational implications of having a data set that describes the connections of a "normal" person? 

The second article, "Cheaters Find an Adversary in Technology," reads as a shameful bit of commercial promotion masquerading as journalism, but it does usefully illuminate the worldview of the standardized test industry.  The story is about a company that uses statistics to detect cheaters.  Its algorithms are designed to detect things like similar patterns of wrong answers, changed answers, and big improvements in test scores.  If a group of students all misunderstood something in the same way, it would look like cheating.  And a test taker who "saw the light" at one point and went back and changed several answers will look like a cheater.  And the thing we do most in school -- attempting to teach people stuff -- would, if successful, lead to big improvements in test scores.  But that too, according to the experts, would look like cheating.
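The point can be made concrete with a sketch. The code below is not the vendor's actual algorithm -- the function names, thresholds, and scoring are invented here for illustration -- but it shows why each kind of flag also catches perfectly innocent behavior:

```python
# Hypothetical sketch of the kinds of flags described in the article:
# shared wrong answers, many changed answers, and big score gains.
# None of this is the company's real method; thresholds are invented.

def shared_wrong_answers(a, b, key):
    """Count questions two students both missed with the SAME wrong answer."""
    return sum(1 for x, y, k in zip(a, b, key) if x == y and x != k)

def flags(student, key, prior_score, erasures, pair_overlap,
          gain_cut=25, erasure_cut=5, overlap_cut=8):
    """Return which suspicion flags a test taker trips."""
    score = sum(1 for x, k in zip(student, key) if x == k)
    return {
        # successful teaching produces exactly this signature
        "big_gain": score - prior_score >= gain_cut,
        # so does "seeing the light" and going back to fix answers
        "many_changes": erasures >= erasure_cut,
        # so does a class that misunderstood something the same way
        "answer_similarity": pair_overlap >= overlap_cut,
    }
```

Two students who share a misconception will push `shared_wrong_answers` up just as surely as two students sharing a crib sheet; the statistic cannot tell the difference.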

There is an arrogance about testers (the gentleman profiled calls himself, unselfconsciously notes the journalist, "an icon" -- those who have never heard of him are poorly informed) that consistently rankles.  And their self-promotion as agents of fairness and meritocracy (recall The Big Test) is simple hypocrisy.  More problematic, though, is the influence on teaching, learning, and scholarship of a regime that bases its authority and legitimacy on science and objectivity, but that shrouds itself in secrecy and lives OFF rather than FOR education.

Why these two articles together?  They suggest a sort of pincer maneuver against "the human" based in information -- on one flank, structure, define the normal brain to a (particular) giant matrix of ones and zeros, while on the other, behavior, treat statistically unusual patterns of activity as morally suspect.  "Super Crunching" may be a way of the future, but one might lament the likelihood that it is THE way of the future, crowding out or delegitimizing other forms of inquiry into the human condition.  Together, these two articles suggest the imperative of an affirmative complement to our fascination with what we CAN do with information.

Source Mentions and Allusions
  1. Ayres, Ian. 2008. Super Crunchers: Why Thinking-by-Numbers Is the New Way to Be Smart.
  2. Foucault, Michel. 1995 (1975). Discipline and Punish: The Birth of the Prison.
  3. Gabriel, Trip. 2010. "Cheaters Find an Adversary in Technology." New York Times, December 27, 2010.
  4. Swedberg, Richard. 2000. Max Weber and the Idea of Economic Sociology.
  5. Vance, Ashlee. 2010. "In Pursuit of a Mind Map, Slice by Slice." New York Times, December 27, 2010.

Tuesday, December 14, 2010

Wikileaks Mirror Servers Geographic Visualization

A fascinating Google Earth visualization of the wikileaks mirror sites worldwide.

Monday, December 13, 2010

An Interesting Web Book "App"

What reminds you of what? When one reads -- or hears about -- a book, one almost unconsciously makes connections -- this book is a little bit like that book. When you tell someone you are interested in some topic, s/he will often say, "well, then you should have a look at ...."

I just stumbled across a web resource, http://www.librarything.com/, that implements this as a combination of a personal library catalog and a social network.  It allows you, virtually, to surf your own library and connect from books you know to books that are related to them. Users "tag" books, creating an interesting way to slice through the database. Try these, for example: sociology, history, philosophy, economics. And it keeps an eye on where a given book is available -- libraries, bookstores, online digital sources, used book networks (like abebooks.com).

When I played around with it looking for books on the sociology of information I got a bookshelf that nearly mirrored the books in front of me on my study's shelves, but with a few titles I was unfamiliar with:

Wednesday, December 08, 2010

Wikileaks Conversation Continues

Interesting piece in NYT blog "The Lede" about online activists' response to credit card companies and PayPal "blacklisting" Wikileaks. 

The entry includes the YouTube "manifesto" of the group (or, rather, decentralized network) "Anonymous" that claims to be at the center of this backlash.

The Times Blog gives a list of related posts:

Sunday, December 05, 2010

An Old Idea Wikileaks has Gotten Me Thinking About Again

I have been sitting on a thought experiment for some years now.  Well, not exactly sitting on it -- have written a bit about it and teach it in my "sociology of everyday life" class.

It starts from Simmel's observation (in "How is Society Possible?" -- a brilliant essay, BTW) that a starting point for understanding social interaction has to be the recognition that the human condition involves awareness that one can never completely know the mind of the other.  No matter how intimate the relationship, there is material held back. 

So, imagine this.  One day, god gets a funny idea.  S/he suddenly makes people's mental content available to those around them.  All the fleeting thoughts, the quick little zigs and zags our minds make (making a cake with my mom, talking with her while I washed the dishes about her mother's death, stealing wet cement from that construction site where I smoked my first cigar, Denise my "girlfriend" in seventh grade though I liked Kim better, that pad Thai tonight was tasty if a bit heavy, I can't believe I mistakenly bought 2% milk the other day -- all that between these two sentences and this report highly censored) fully audible to anyone around us. Everyone her own Ulysses.  How exactly it would work, I'm not sure -- but imagine that there's some way that the cacophony of it all would be sorted out and we'd be privy to the internal conversations of those around us (and they ours -- and both of us privy to our reactions to what we were hearing).

So, god does this for maybe 15 minutes and then shuts it down.  This would, I think, have a profound effect on us.  God would be amused.  But then s/he gets another idea: before heading off to other realms, s/he announces "that was so much fun, I think I'll do it again sometime."

That, I propose, could be the end of social life as we know it.

Friday, December 03, 2010

Wikileaks and Protecting Your Sources

In the NYT, Alan Cowell wrote today about reactions among diplomats to the WikiLeaks leaks.  In the middle of the story we read:
A Chinese intellectual, who spoke in return for customary anonymity, said the disclosures had left those like him who had contact with United States diplomats “nervous” about the possibility of exposure and persecution by authorities who have already blocked access in China to the WikiLeaks Web site.
I don't want to equate journalistic secrecy with government secrecy, but I'm surprised, as I suggested in a previous post, that there's been no commentary (or at least none I've seen -- anyone have a reference?) on the irony of the secrecy and confidentiality given sources (as above) by the media vs. the ones revealed in the leaks.

NOTE: it appears that in a lot of the material that's been put online by media organizations some redaction of source information has been carried out.

Wednesday, December 01, 2010

FTC Proposes "Do Not Track" Option for Consumer Privacy

The Federal Trade Commission released a preliminary report, "Protecting Consumer Privacy in an Era of Rapid Change," for public comment today. Among other things, it did suggest the "do not track" option for web surfers. Here's the NYT article on the report.

A few weeks ago E. Wyatt and T. Vega wrote of the then forthcoming FTC report on net privacy in "Stage Set for Showdown on Online Privacy" (NYT November 9, 2010):
"Consumer advocates worry that the competing agendas of economic policy makers in the Obama administration, who want uniform international standards, and federal regulators, who are trying to balance consumer protection and commercial rights, will neglect the interests of people most affected by the privacy policies. “I hope they realize that what is good for consumers is ultimately good for business,” said Susan Grant, director of consumer protection at the Consumer Federation of America."
The report contains what look like some good, balanced, and practical guidelines for how consumers and information collecting entities interact on the web and elsewhere.

I'd like to propose, as a thought experiment, a more radical approach.  What if we started from the premise that everyone owns her own information?  You own your opinions, your attitudes, and the traces your behavior might create.  If this information is valuable to another entity, they are free to bid on it.  We don't need privacy protections, we just need an infrastructure that will allow a market for private information to operate.

A website or a retailer can have an offer, right at the front door: if you want to browse here, I want to know your name and take note of what you look at.  The consumer, in return, can say, you can watch me, for 5 dollars.  Consumers can make money by moving around the net and generating value.  The entities who host websites on which behavior turns into information turns into value would also be entitled to a share.

Now take the idea a step further.  Suppose rather than selling my information I agree to license it.  This time I say, you can watch me for $5 but down the road, if any value accrues to you by virtue of you aggregating my information with that of others, I want a cut.  As my information goes upstream, up the aggregation pyramid, it becomes a component in something valuable: I deserve a share. 

Of course, we'll be told this is completely impractical.  Retailers and other entities would just build in the cost.  And the transaction costs would be too high.  Maybe.  But we've got micro-credits  worked out at the level of single click-throughs.  I don't think the barriers would be technical.
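Mechanically, the licensing idea sketched above is simple. The toy settlement function below is my own illustration of it -- the upfront fee, royalty rate, and contributor names are all hypothetical, not a proposal for real numbers:

```python
# Toy sketch of the information-licensing idea: each person licenses a
# behavioral trace for an upfront fee plus a royalty share of any value
# later created by aggregating it. All figures here are hypothetical.

UPFRONT = 5.00       # the "you can watch me for 5 dollars" fee
ROYALTY_RATE = 0.02  # each contributor's cut of realized downstream value

def settle(contributors, downstream_value):
    """Return what each contributor is owed once aggregate value is realized."""
    per_head = downstream_value * ROYALTY_RATE
    return {name: UPFRONT + per_head for name in contributors}

# If aggregating three people's traces eventually yields $1000 of value,
# each is owed the $5 fee plus 2% of $1000, i.e. $25.
payouts = settle(["alice", "bob", "carol"], downstream_value=1000.0)
```

The bookkeeping is no harder than what ad networks already do for per-click micro-credits; the hard part, as the post notes, is institutional rather than technical.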

From Information Superhighway to Information Metrosystem

The new FTC report on consumer privacy has an interesting graphic in an appendix. It purports to be a model of the "Personal Data Ecosystem." It's interesting as an attempt to portray a four-mode network: individuals, data collectors, data brokers, and data users. The iconography here seems to be derived from classic designs of subway and underground maps.

From http://www.ftc.gov/os/2010/12/101201privacyreport.pdf.

The genre mixing in the diagram invites, on the one hand, a critical look at where the FTC is coming from in the report (which, in my limited experience of digesting FTC output looks relatively well done) and, on the other, points toward a need to better conceptualize the various components and categories.

Under "collectors," for example, we have public, internet, medical, financial and insurance, telecommunications and mobile, and retail. The next level (brokers) includes affiliates, information brokers, websites, media archives, credit bureaus, healthcare analytics, ad networks and analytics, catalog coops, and list brokers. Finally, on the info users front we have employers, banks, marketers, media, government, lawyers and private investigators, individuals, law enforcement, and product and service delivery.
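One way to make the diagram's structure concrete is to treat it as a layered (multipartite) graph. The node labels below come from the report's appendix as listed above; the sample data-flow edges and the traversal function are my own hypothetical illustration, not anything in the FTC document:

```python
# The FTC diagram's four modes as layers of a multipartite graph.
# Layer contents follow the report's appendix; edges are hypothetical.

layers = {
    "individuals": ["individual"],
    "collectors": ["public", "internet", "medical", "financial and insurance",
                   "telecommunications and mobile", "retail"],
    "brokers": ["affiliates", "information brokers", "websites", "media archives",
                "credit bureaus", "healthcare analytics",
                "ad networks and analytics", "catalog coops", "list brokers"],
    "users": ["employers", "banks", "marketers", "media", "government",
              "lawyers and private investigators", "individuals",
              "law enforcement", "product and service delivery"],
}

# An edge (a, b) means "personal data flows from a to b" -- e.g. a
# retailer's purchase records reaching marketers via a list broker.
edges = [
    ("individual", "retail"),
    ("retail", "list brokers"),
    ("list brokers", "marketers"),
]

def flows_reaching(target, edges):
    """Trace backwards: which nodes feed data, directly or not, into target?"""
    upstream, frontier = set(), {target}
    while frontier:
        frontier = {a for a, b in edges if b in frontier} - upstream
        upstream |= frontier
    return upstream
```

Even this crude representation makes one conceptual point visible: a "user" like a marketer can sit several hops away from the individual whose data it consumes, which is part of what makes the categories hard to pin down.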

It's a provocative diagram that helps to focus our attention on the conceptual complexity of "personal information" in an information economy/society. More on this to follow.

Tuesday, November 30, 2010

Leaking Irony

While I work on more extended analysis of the WikiLeaks situation (among other things the obvious connection to my work on how geometries of information sharing are co-constitutive of social relationships and statuses), a small irony must be noted.

Apparently, several news organizations have had the recently released material since August.  Editors and reporters have been meeting in secret to develop protocols about what would be reported, when, and how.  Fortunately for their work, it appears that these journalists managed to do all of this while maintaining the kind of secrecy necessary for them to be able to process the information and to consider its meaning and its implications out of public view.  The public, media, and official reaction of the last few days make clear why this secrecy was necessary.

One thing that would be interesting to hear a story on would be what measures were taken to ensure the security of the process.  What sorts of technological tools were employed?  What sorts of social tools?  Did participants have to sign confidentiality agreements?   What prevented a rogue reporter from reporting on the reporters reporting?

Friday, November 12, 2010

Work Slowdown in Soc of Info


Have been swamped with teaching and administrative work of late, and trying to spend two days a week at the Center for Advanced Study in the Behavioral Sciences has me on a slow blogging output these days.

Am working on pieces on forms (the kind you fill out) as rationalizing filters and "information interaction protocols," the assumptions behind pending online-privacy proposals (from Commerce and the FTC), the soc of info implications of the story of the CIA official who knew that a Jordanian contact was a problem (a double agent, as it turned out -- he later blew himself up at a remote CIA location), and what it means when we make moral judgments about people's ignorance of some thing (as in, "she didn't even know what hip-hop was!").

Meanwhile, thanks for reading. Would love to read any comments you might have -- anyone out there?



Sunday, October 17, 2010

Peer to Peer Education: Can Students Teach One Another?

One of society's major "information institutions" is, of course, the university (and colleges, too). In these institutions information is generated, classified, evaluated, sanctioned, organized, and systematically disseminated.

There are lots of interesting experiments going on in and around the university connected with its various fundamental information functions (e.g., opentextbook.org, wikibooks, OpenCourseWare, and, of course, all manner of distance learning). Each of these experiments plays with changing how we think about one piece of the education equation.

I've just come across one that takes the university itself out of the picture: The Peer 2 Peer University (P2PU). P2PU is structured as an online community of open study groups whose members engage one another in short university-level courses. Their model is to connect open educational resources and small groups of motivated learners. P2PU supports the endeavor with a course infrastructure that facilitates course design by an "organizer," interaction among participants, access to materials, and methods for recognition of students' and tutors' work. Initially focused on more technical skills, the organization seems very committed to making sure that P2PU is an ongoing, distributed research project on the topic of new ways to organize learning.

The video below is a bit amateurish on the production side, but gives some idea of the why and the how behind P2PU. The project also maintains a wiki that gives you a sense of how they do what they do.

Peer 2 Peer University 2010 from P2P University on Vimeo.

Monday, August 09, 2010

Do Organizations that 'Fess Up Do Better?

In a letter to the NYT on 9 August, responding to an article on radiation overdoses in medical tests, Geoffrey W. McCarthy, a retired chief medical officer for the V.A., wrote about two approaches to how organizations manage information about organizational errors. He notes that the issue illustrates the contradictions between "risk management" and "patient (or passenger or client or consumer) safety."

He notes that the risk manager will say "don't disclose" and "don't apologize" because these could put the organization at legal or financial risk. A culture of safety and organizational improvement, though, would say "fully disclose," not because it will help the patient, but because it is a necessary component of organizational change. The organization has to admit the error if it is going to avoid repeating it, he asserts.

This suggests a number of sociology of information connections, but we'll deal with just one here. This example points to an alternative to the conventional economic analysis of the value of information. The usual approach is to "price" the information in terms of who controls it and who could do what with it (akin to the risk manager's thinking above). But here we see a process value -- the organization itself might change if it discloses the information (independent, perhaps, of the conventional value of disclosure or non-disclosure). One could even imagine an alternative pricing scheme that says "sure, Mr. X might sue us, but by disclosing the information we are more likely to improve our systems in a manner that lets us avoid this mistake in the future (along with the risk it poses to us and the costs it might impose on society)." Why pour resources into hiding the truth rather than into using the information to effect change?

One rebuttal to this says that an organization can do both, and maybe so. Another would say that this is just mathematically equivalent to what would happen in litigation (perhaps through punitive damages).

But I think that Mr. McCarthy is onto something in terms of "information behaviors." There are, I expect, a whole bunch of "internal externalities" associated with what we decide to do with information. In other places I've examined the relational implications of information behavior. This points to another family of effects: organizational. More to come on this.

Information and Educational Assessment I

In a letter to the NYT about an article on radiation overdoses, George Lantos writes:

My stroke neurologists and I have decided that if treatment does not yet depend on the results, these tests should not be done outside the context of a clinical trial, no matter how beautiful and informative the images are. At our center, we have therefore not jumped on the bandwagon of routine CT perfusion tests in the setting of acute stroke, possibly sparing our patients the complications mentioned.

This raises an important, if nearly banal, point: if you don't have an action decision that depends on a piece of information, don't spend resources (or run risks) to obtain the information.  The exception, as he suggests, is when you are doing basic science of some sort.

Now consider, for a moment, the practice of "assessment" in contemporary higher education.  An industry has built up around the idea of measuring educational outcomes in which a phenomenal amount of energy (and grief) is invested to produce information that is (1) of dubious validity and (2) does not, in general, have a well articulated relationship to decisions.

Now the folks who work in the assessment industry are all about "evidence based change," but they naively expect that they can, a priori, figure out what information will be useful for this purpose.

They fetishize the idea of "closing the loop" -- bringing assessment information to bear on curriculum decisions and practices -- but they confuse the means and the ends.  To show that we are really doing assessment we have to find a decision that can be based on the information that has been collected.  Not quite the "garbage can model of decision-making," but close.

Perhaps a better approach (and one that would demonstrate an appreciation of basic critical thinking skills) to improving higher education would be to START by identifying opportunities for making decisions about how things are done and THEN figuring out what information would allow us to make the right decision and THEN how we would best collect said information.  Such an approach would involve actually understanding both the educational process and the way educational organizations work.  My impression is that it is precisely a lack of understanding and interest in these things on the part of the assessment crowd that leads them to get the whole thing backwards.  Only time will tell whether these scientist-manqués manage to mediocritize higher education or not.

Thursday, July 29, 2010

Open Science : A Sociology of Information Topic par excellence

Conference at Berkeley this weekend on changing the way we think about scientific knowledge http://opensciencesummit.com.

From the conference website

Open Science Summit 2010: Updating the Social Contract for Science

July 29-31 International House Berkeley
Synthetic Biology, Gene Patents, Open Data, Open Access, Microfinance for Science, DIY science, DIY Biology, Alternative Funding for Science, Open Source Drugs, Patent Pools, Open Health/Medicine, Patient Advocacy for Innovation

"Ready for a rapid, radical reboot of the global innovation system for a truly free and open 21st century knowledge economy? Join us at the first Open Science Summit, an attempt to gather all stakeholders who want to liberate our scientific and technological commons to enable an new era of decentralized, distributed innovation to solve humanity's greatest challenges."

Tuesday, July 27, 2010

A Few Limits on Copyright

Until a few days ago, most of us did not know that the Digital Millennium Copyright Act of 1998 empowers/requires the Librarian of Congress to "determine whether there are any classes of works that will be subject to exemptions from the statute's prohibition against circumvention of technology that effectively controls access to a copyrighted work" (U.S. Copyright Office 2010). But it does.

And here's what's changed as a result of James H. Billington's triennial interpretation.

(1) College professors (in general) and students (in film and media studies, at least) can circumvent DVD security measures to include snippets of motion pictures into new works for the purpose of criticism or comment for educational purposes. A similar exemption exists for documentary filmmaking and noncommercial videos.

(2) You can hack programs on your phone if the purpose is to get programs you have legally obtained to work together. This is interpreted to mean you can "jailbreak" an iPhone and load non-Apple apps.

(3) You can hack programs on your phone if the purpose is to connect it to a telecommunications network you are authorized to connect to. In other words, you can hack your iPhone so it works on Verizon.

(4) You can hack a video game you own if it's just for testing or fixing security flaws as long as you don't use the information you get from the process to help folks violate copyright.

(5) If you own software that's protected by a dongle and you can't use it because the dongle is broken and no replacement is available, then you can hack the software to get around the dongle.

(6) If you have an ebook and all existing ebook editions disable read-aloud, then you can hack it to enable read-aloud. In other words, if the copyright owner doesn't offer to sell a read-aloud-enabled version, then you can break the controls that prevent read-aloud on a copy you own. Note that it seems the publisher could offer for sale a million-dollar read-aloud-enabled version to get around this. Presumably, the exception won't unravel retroactively -- the question will be whether a read-aloud-enabled version was available on the day you hacked the control.


U.S. Copyright Office. 2010. "Statement of the Librarian of Congress Relating to Section 1201 Rulemaking."

Wortham, Jenna. 2010. "In Ruling on iPhones, Apple Loses a Bit of Its Grip," New York Times July 26.

Friday, March 12, 2010

Regulating the Supply of Law

From the "Friends and Relatives of the Department" Files...
The way states regulate professions is a topic of sociological interest.  The degree to which citizens have access to legal services to solve legal problems is a topic of sociological interest. As argued previously on this blog ("Equality, Information and the Courts Redux," "Democracy and the Information Order," "Courts and the Information Order," "Suing for Information"), the way the courts work is a topic of sociology of information interest. In this op-ed, these issues come together in a sociologically interesting way. You may recognize the author of the piece as my sometime co-author (and wife).
-- Dan.

A case for legal aid at Wal-Mart

By Gillian Hadfield
Friday, March 12, 2010

The United States stands largely alone in advanced-market democracies in drastically restricting where and how people can get help with their legal problems. In all states, under rules created by bar associations and state supreme courts, only people with law degrees and who are admitted to the state bar can provide legal advice and services of any kind. [Read More]

Friday, February 05, 2010

Technologically Induced Social Alzheimers

David Pogue has a nice little piece called "Why We Make Home Videos" on the NYT website. It's basically a personal tale in defense of home videos, but he starts out reminding readers of something he's written about a number of times, data rot.

Data rot is the tendency for technology to evolve so fast that we are all left with lots of information stored on media for which there no longer exists a device to play it. The implication of this is that society as a whole "has" lots of information that it might have no way of accessing. Hence the title of this post. Of course the ironic thing is that the social problem is hardware outstripping the memory while in the personal case it's sort of the hardware failing the memory.

But it points to an interesting idea: perhaps the explosion of information -- and our general capacity to store, move, and process it -- comes with some self-limiting counter-tendencies. One is complexity -- too much information, and no one has the synoptic view or cleverness to understand what it means. Another is the connect-the-dots problem I've written about here. Yet another is data rot -- backwards compatibility always has its limits.  I wonder if anyone has sat down to map out what sorts of information are likely to move into the darkness of rot, and when.  Are all the data on punch cards gone from the social memory yet?  How about all those 24-inch fixed disk platters we used to get mounted on our System/370 machine?  I know my college thesis on its 8-inch IBM Series/1 diskette is basically lost to time.  What else?

Related Posts
"The More Information the Better"?
What Society Knows

Sunday, January 31, 2010

Three Kinds of Information Sensitivity

OK, a naive meditation on three modes of paying attention to the world.

Pretend you are a politician, perhaps a senator or member of congress. What do you pay attention to?

Some would have us believe that poll numbers are the most important. You open your mouth, emit a sound bite, the media disseminates it, people react and respond to polls, and you adjust accordingly. Depending on your point of view, that's either democracy in action or appalling pandering. In either case, the opinions/reactions of "the people" are aggregated via some presumably reliable and accurate method.

Another theory would be that you are listening to powerful interests who have your ear and who donate to your campaign. Your comments are probably a little more proactive than reactive -- they've let you know what they want to hear and so you make sure you say it. But as above the whole thing is a cycle -- we get the initial attention by saying things and then it cycles from there. In this case, though, the method for aggregating the reactions (and pre-actions) of donors is harder to suss out. Tally up the dollars? Is there a pecking order? Or a "one topic each" rule?

A third approach would be that you apply accepted methods of policy analysis and make use of trustworthy data to decide what policies would best achieve desired aims. Here information is aggregated and decisions made using generally accepted (and open) methods. Of course, deciding on those aims is an information problem that can bring us right back into one of the first two approaches, but we'll set that aside for the moment.

My guess is that a system COULD run on any of these three approaches to information processing. What presents a challenge to governability, though, is when one or more of these is the public face of what's going on while another is what's going on behind the scenes. Or, worse, when the actors themselves don't really have a handle on when they are using one or another to try to ascertain how to govern.

And yes, this could be seen as an attempt to translate direct democracy, some variation on aristocratic pluralist democracy, and technocracy (help me on the terms, polisci friends) into information terms.

Tuesday, January 19, 2010

New Digital News Outlet from KALW

This week KALW is launching its new local digital magazine to complement their broadcast work.  The new site has a way for community leaders to plug in and help them do a better job of reporting on the arts and other community events and issues.  Users can become "community correspondents".  Check it out, help them tell others about it and together we can do a better job of becoming the media we want to create.

Here's the magazine: http://www.kalwnews.org/

This is the community page: http://www.kalwnews.org/community

And here's their FB group to stay in touch: http://www.facebook.com/pages/KALW-News/195280839624

Friday, January 08, 2010

Great Info Blog and Interesting Sounding Conference (NY Feb)

Check out Graham Webster's excellent blog, infopolitics, for some insightful essays on topics not unrelated to those I'm writing about here.

While reading through that blog, came across mention of an interesting conference to be held in February at the New School in NYC: Conference on Information Flow Restrictions at the New School. It actually had me looking at plane ticket prices and thinking "whom do I know that I could stay with...." If you know how little I like to travel during the semester, you get the idea that I was intrigued.

Wednesday, January 06, 2010

Those damn unconnected dots again (rough draft)

An article in the Times, under the headline "Obama Says Plot Could Have Been Disrupted," reprises the metaphor of "connecting the dots" to describe different pieces of information having been in different heads, but never getting put together in one head that could make sense of them.

It is reassuring that Obama is speaking bluntly about organizational performance rather than riding roughshod over the Constitution, but, as argued in an earlier piece ("Mind the Gap"), the idea that it's a simple problem of dot connecting is a basic misconception.

How do you hear "connect the dots"?  One version is reminiscent of a detective show or Agatha Christie novel; the challenge is to assemble hints -- pieces of information that, alone, are not conclusive proof of anything -- in such a way that the "answer" emerges as a sort of logical necessity.  The "logic" is in the mind of the beholder, but that's all.

A different version is reminiscent of the way we draw lines between stars and come up with "constellations."  Two things are important.  One, the stars are not really next to one another -- the viewer is the one who sees them as points on a plane and interpolates and extrapolates the other vertices of the figure.  Two, there's no there there -- the crab in Cancer or the warrior in Orion has to be brought to the observation by us.

The first requires us to have all the pieces on the table and be open to what they "tell us" when seen together.  The challenge for intelligence agencies is to put the information from various sources onto the same table.

The second requires us to decide what to pay attention to and what to ignore (left), how to connect and not connect (middle), and what to add that's not there (right).

If we increase the degree of information sharing we fill up our field of view with more and more points and the dots get harder and harder to connect.

On the other hand, if we ask the different agencies to filter the information then we are back in hot water because none of them know what they are looking for.

The president was furious about the failure of the system to see "the red flags" and intelligence agencies are reported to have said that the information they had was "vague but available."  The problem is that flags are not, in general, a priori red.  Presumably, some smart people are thinking about how systems see and things like that; hopefully, they don't just think of it as "connect the dots."

We observe with some irony that the actual policy response to the problem -- at least the response that's been announced -- is in fact to gather more information via increased screening.

Oh, and if we look up "connect the dots" in Wikipedia you get a short article about a children's game. It bears a Wiki-warning: "This article may require cleanup to meet Wikipedia's quality standards."