Sunday, January 22, 2012
A sociology of information triptych this morning. Disclosure laws that fail to fulfill their manifest/intended function, the secret work of parsing public information, and the pending capacity to record everything all bear on the question of the relationship between states and information.
In a 21 Jan 2012 NYT article, "I Disclose ... Nothing," Elisabeth Rosenthal (@nytrosenthal) suggests that despite increasing disclosure mandates we may not, in fact, be more informed. Among the obviating forces are information overload, a dearth of interpretive expertise, the tendency of organizations to hide behind "you were told...", the way formal rules hand organizations a blueprint for playing with technicalities (as, she notes, Republican PACs have done, using name changes and re-registration to "reset" their disclosure obligation clocks), routinization (as in the melodic litanies of side effects in drug adverts), and the simple fact that people are often in no position to act on information even when it is abundantly available and unambiguous. On the other side, the article notes that there is a whole "industry" out there -- journalists, regulators, reporters -- who can data-mine the disclosure information even if individuals cannot take advantage of it.
Rachel Martin's (@rachelnpr) piece on NPR's Weekend Edition Sunday, "CIA Tracks Public Information For The Private Eye," describes almost the mirror image of this: how intelligence agencies are building infrastructure for finding patterns in, and making sense of, the gadzillions of bits of public information that just sit there for all to see. It's another case that hints at an impossibility theorem about "connecting the dots" a priori.
And finally, in another NPR story, "Technological Innovations Help Dictators See All," Rachel Martin interviews John Villasenor about his paper, "Recording Everything: Digital Storage as an Enabler of Authoritarian Governments," on the idea that data storage has become so inexpensive that there is no reason for governments (they focus on authoritarian ones, but there is no reason to limit it) not to collect everything (even if, as the first two stories remind us, they may currently lack the capacity to do anything with it). I wonder if surveillance uptake and data rot will prove to be competing tendencies.
The first piece suggests research questions: What are the variables that determine whether disclosure is "useful"? What features of disclosure rules generate cynical work-arounds? If "more is not always better," what is? Can we better theorize the relationship between "knowing," openness, transparency, disclosure, and democracy than we have so far?
The second piece really cries out for an essay capturing the irony of how the information pajamas get turned inside out, with the spy agency trying to see what is right in front of everyone (we are reminded, in a perverse sort of way, of Poe's "The Purloined Letter"). Perhaps we'll no longer associate going "under cover" with the CIA.
And the alarm suggested in the third piece is yet another entry under what I (and maybe others) have called the informational inversion -- when the generation, acquisition, and storage of information dominates by orders of magnitude our capacity to do anything with it.
Sunday, January 08, 2012
Journalism and Research Again
Lots of Twitter and blog activity in response to a NYT article about the Chetty, Friedman, and Rockoff research paper on the effects of teachers on students' lives.
No small amount of the commentary is about how, when journalists pick "interesting" bits out of research reports to construct a "story," they often create big distortions in the social knowledge base.
So what can reporters do when trying to explain the significance of new research, without getting trapped by a poorly-supported sound bite?
Sherman Dorn has an excellent post on the case, "When reporters use (s)extrapolation as sound bites," that ends with some advice:
- "If a claim could be removed from the paper without affecting the other parts, it is more likely to be a poorly-justified (s)implification/(s)extrapolation than something that connects tightly with the rest of the paper."
- "If a claim is several orders of magnitude larger than the data used for the paper (e.g., taking data on a few schools or a district to make claims about state policy or lifetime income), don’t just reprint it. Give readers a way to understand the likelihood of that claim being unjustified (s)extrapolation."
- "More generally, if a claim sounds like something from Freakonomics, hunt for a researcher who has a critical view before putting it in a story."
See also Matthew Di Carlo on ShankerBlog, Bruce Baker on SchoolFinance 101, and Cedar Reiner on Cedar's Digest
Labels: education, journalism, research, science reporting