Sætnan, Ann Rudinow (2015). The Haystack Fallacy – or Why Big Data Provides Little Security; 3rd International LAVITS Symposium: Surveillance, Technopolitics, Territories; 13th – 15th May 2015. Rio de Janeiro, Brazil.
This is the abstract of a critique of the rationale behind the Big Data solutions currently being nurtured within the fields of marketing and computer science. Examples from the analysis of search queries and of transaction data gleaned from loyalty cards ("Google Flu Trends" and the "Target Pregnancy Diagnosis") have entered the canon of university textbooks, where they play a role reminiscent of origin myths. We are told that security forces need to collect all our metadata if they are to keep us safe from terrorism, from organized crime, from child abuse and child pornography, and so on. Meanwhile, computational service providers seem eager to promise that, with enough data and the right algorithms, they will be able to predict and prevent crimes.
Chris Anderson's 2008 article in Wired ("The End of Theory: The Data Deluge Makes the Scientific Method Obsolete", Wired 16.07) serves, somewhat ironically, as a theoretical justification for developing new algorithmic tools for pattern recognition and applying them far beyond the field of marketing, for instance in the natural sciences. Anderson offers four arguments as to why Big Data has (or should have) rendered established statistical methods in science obsolete. Each of these arguments is examined and the fallacy behind it explained.