"Drive-by regression" is my phrase - I think an original coinage - for describing what economists (or statisticians or physicists) do when they pick some other field, grab some convenient data, take it out of its context and perform some statistical analysis on it, preferably finding some kind of counter-intuitive result, and then depart, leaving the locals to deal with the resulting mess.
Freakonomics is perhaps the highest-profile example (or, in a similar vein, something like The Logic of Life), but there are plenty of others: physicists or statisticians shooting up historical linguistics, for example. Anyone got any favourites?
From that wonderful book The Drunkard's Walk, by Leonard Mlodinow, this passage about the O. J. Simpson trial:
"The prosecution made a decision to focus the opening of its case on O. J.'s propensity toward violence against Nicole. Prosecutors spent the first ten days of the trial entering evidence of his history of abusing her and claimed that this alone was a good reason to suspect him of her murder. As they put it, "a slap is a prelude to homicide." [14] The defense attorneys used this strategy as a launchpad for their accusations of duplicity, arguing that the prosecution had spent two weeks trying to mislead the jury and that the evidence that O. J. had battered Nicole on previous occasions meant nothing. Here is Dershowitz's reasoning: 4 million women are battered annually by husbands and boyfriends in the United States, yet in 1992, according to the FBI Uniform Crime Reports, a total of 1,432, or 1 in 2,500, were killed by their husbands or boyfriends. [15] Therefore, the defense retorted, few men who slap or beat their domestic partners go on to murder them. True? Yes. Convincing? Yes. Relevant? No. The relevant number is not the probability that a man who batters his wife will go on to kill her (1 in 2,500) but rather the probability that a battered wife who was murdered was murdered by her abuser. According to the Uniform Crime Reports for the United States and Its Possessions in 1993, the probability Dershowitz (or the prosecution) should have reported was this one: of all the battered women murdered in the United States in 1993, some 90 percent were killed by their abuser. That statistic was not mentioned at the trial."
..."Dershowitz may have felt justified in misleading the jury because, in his words, "the courtroom oath — 'to tell the truth, the whole truth and nothing but the truth' —is applicable only to witnesses. Defense attorneys, prosecutors, and judges don't take this oath . . . indeed, it is fair to say the American justice system is built on a foundation of not telling the whole truth." [16]"
The book is about use and misuse of statistics in everyday life. Very well worth reading.
It confirms one thing: statistics is very malleable.
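The Dershowitz passage above is a classic base-rate confusion, and it can be sketched numerically. The battering and homicide counts below are the ones quoted from the FBI Uniform Crime Reports in the passage; the rate of battered women murdered by someone other than their partner is an assumed figure for illustration only (the book reports the ~90% result directly rather than deriving it this way):

```python
# Figures quoted in the passage (1992/1993 FBI Uniform Crime Reports,
# as cited by Mlodinow).
battered = 4_000_000          # women battered annually by husbands/boyfriends
killed_by_abuser = 1_432      # battered women killed by husband or boyfriend

# Dershowitz's (irrelevant) probability: that a batterer goes on to kill.
p_killed_given_battered = killed_by_abuser / battered  # roughly 1 in 2,800

# The relevant conditioning: among battered women who WERE murdered,
# what fraction were killed by their abuser?  Suppose, purely for
# illustration, that about 1 in 25,000 battered women is murdered by
# someone other than her partner (an assumed figure, not from the book).
killed_by_other = battered / 25_000   # 160 murders by non-partners (assumed)
p_abuser_given_murdered = killed_by_abuser / (killed_by_abuser + killed_by_other)

print(f"P(killed | battered)              = 1 in {battered / killed_by_abuser:.0f}")
print(f"P(killer was abuser | murdered)   = {p_abuser_given_murdered:.0%}")
```

Under that assumption the second probability comes out near the 90% the book reports: the two numbers condition on entirely different events, which is why the defence's "1 in 2,500" figure, while true, was beside the point.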
Mine is marketing mix modeling taken to extremes.
My understanding of statistics as a branch of mathematics is limited to simple Venn diagrams, pie charts, bar graphs, and histograms: elementary things, in fact.
In maths, I often see the word "elegant" bounced around. A solution is not merely a solution; it is often described as elegant. Does the same term apply to stats? Do you solve a stats poser with a solution rated elegant? Or does it belong to a separate category where a solution is just a solution?
I usually think of "elegance" as a judgement of pure mathematics (which includes abstract statistics), but statistical applications - the formulation of a clever natural experiment, say - could perhaps be considered elegant as well.
Elegance in a mathematical proof is a virtue (though an inelegant proof might, by taking a novel approach, be more productive of further mathematics). Elegance in the formulation of a statistical analysis, on the other hand, can be the result of abstracting away complexities that really should have been incorporated into one's model.
An elegant mathematical proof always remains a proof; an elegant application of mathematics could turn out to be completely wrong-headed.
I'm somewhat bemused by the reverence given to Occam's Razor. At some time it may have cut the crap in ponderous philosophical discussions but it's a likely recipe for stupidity in today's science.
The universe is apparently quite happy chugging along in its own way: massively big, flatly mindless, almost infinitely complex, accountable to no one, and unconceptualised. We require it to fit into little natural laws that must be restricted to a handful of variables so we can hold them in our brains. To make matters worse, these theories must fulfil a further requisite of being sexy enough to get us noticed.
This all works up to a point, but sooner or later we need to tackle problems that are too big for the strategy, problems like education or climate or economics, which we then compulsively dumb down into glib simplifications, fairy stories and morality plays. Obviously this craziness can't go on forever, but on the other hand, it's hard to see where enough humility to desist might come from.
Drive-by regression is a brilliant coinage! Government-funded education and development economics abound in this sort of gang violence.
Rossi's 'metallic laws' predict that the long-run, properly measured impact of any policy instrument is zero, but I think this is only true where the 'interessement mechanism' (i.e. politicking for the implementation of the new policy prescription or problematization) is not funded by the directly affected parties.
So if, for example, kids funded curriculum reform, or very poor people funded microfinance research, or the actual long-term unemployed funded manpower policy, then you wouldn't get the mischief, the 'availability cascades', and the 'preference falsification' (Timur Kuran) that are the crack cocaine within the econometric ghetto fueling your 'drive-by regression'.