“There are two ways to be fooled. One is to believe what isn’t true; the other is to refuse to believe what is true.” ~ Søren Kierkegaard
Back in 2010, I wrote a short post about some of the problems associated with getting all of your news and information from biased sources. It was essentially a call for people to hone their critical thinking skills and take steps toward establishing a more reality-based approach to decision-making.
Unfortunately, people don’t like challenging their existing beliefs very much because it can be pretty uncomfortable. They prefer sources of information that support their established worldviews and generally ignore or filter out those that don’t. In our modern society, this confirmation bias supports an entire ecosystem of publishers, news outlets, TV shows, bloggers, and radio announcers designed to serve up pre-filtered opinion disguised as fact.
For many people, the glossy veneer of the news entertainment complex is all they want or need. As David McRaney so succinctly states in his blog:
Whether or not pundits are telling the truth, or vetting their opinions, or thoroughly researching their topics is all beside the point. You watch them not for information, but for confirmation.
The problem with this approach is that — every now and then — fantasy runs into cold, hard reality and gets the sh*t kicked out of it.
This was what happened during the 2012 Presidential election cycle. Talking heads on both ends of the political spectrum had spent months trying to sway their audiences with confident declarations of victory and vicious denials of opposing statements. By the week of the election, the conservative media in particular had created such a self-reinforcing bubble of polls and opinions that any hints of trouble were shouted down and ignored. Pundits reserved particularly strong venom for statistician Nate Silver, whose FiveThirtyEight blog in the New York Times had upped the chances of an Obama win to a seemingly outrageous 91.4% the Monday before the election.
The furor reached its peak with Karl Rove’s famous on-air exchange with Fox News anchor Megyn Kelly, and rippled through the conservative echo chamber after the polls closed. There was a lot of soul searching over the next few days, with many people taking direct aim at the conservative media for its failure to present accurate information to its audience. This frustration was summed up clearly by one commenter on RedState, a right-leaning blog:
“I can accept that my news is not really ‘news’ like news in Cronkite’s day, but a conservative take on the news. But it’s unacceptable that Rasmussen appears to have distinguished themselves from everyone else in their quest to shade the numbers to appease us, the base. I didn’t even look at other polls, to tell the truth, trusting that their methodology was more sound because it jived with what I was hearing on Fox and with people I talked to. It pains me to say this, but next time I want a dose of hard truth, I’m looking to Nate Silver, even if I don’t like the results.”
It was a teachable moment, and Nate Silver — no fan of pundits — suggested that the fatal flaw in the approach taken by most of these political “experts” was that they based their forecasts less on evidence and more on a strong underlying ideology. Their core beliefs — “ideological priors,” as Silver calls them — colored their views on everything and made it difficult to read such an uncertain situation correctly. It was time for something new.
In his book, The Signal and the Noise, Silver elaborates on the work of Philip Tetlock, who found that people with certain character traits typically made more accurate predictions than those without them. Tetlock identified these two different cognitive styles as either “fox” (someone who considers many approaches to a problem) or “hedgehog” (someone who believes in one Big Idea). There has been much debate about which one represents the best approach to forecasting, but Tetlock’s research clearly favors the fox.
Tetlock’s ideas as summarized by Silver:
| Fox | Hedgehog |
| --- | --- |
| Multidisciplinary – incorporates ideas from a range of disciplines | Specialised – often dedicated to one or two big problems and sceptical of outsiders |
| Adaptable – tries several approaches in parallel, or finds a new one if things aren’t working | Unshakable – new data is used to refine an original model |
| Self-critical – willing to accept mistakes and adapt or even replace a model based on new data | Stubborn – mistakes are blamed on poor luck |
| Tolerant of complexity – accepts the world is complex, and that certain things cannot be reduced to a null hypothesis | Order-seeking – once patterns are detected, assumes relationships are relatively uniform |
| Cautious – predictions are probabilistic, and qualified | Confident – rarely changes or hedges their position |
| Empirical – observable data is always preferred over theory or anecdote | Ideological – approach to predictive problems fits within a similar view of the wider world |
Nate Silver also prefers the fox-like approach to analysis and even chose a fox logo for the relaunch of his FiveThirtyEight blog. Befitting a fox’s multidisciplinary approach to problems, his manifesto for the site involves blending good old-fashioned journalism skills with statistical analysis, computer programming, and data visualization. (It is essentially a combination of everything we’ve been saying about data science + data-literate reporting.)
Nate Silver’s Four-Step Methodology for Data Journalism
This approach is very similar to the standard data science process.
- Data Collection – Performing interviews, research, first-person observation, polls, experiments, or data scraping
- Organization – Developing a storyline, running descriptive statistics, placing data in a relational database, or building a data visualization
- Explanation – Performing traditional analysis or running statistical tests to look for relationships in the data
- Generalization – Verifying hypotheses through predictions or repeated experiments
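As a concrete (and much simplified) illustration of the Explanation and Generalization steps, here is a sketch of how a poll aggregator might turn raw polls into a probabilistic forecast. The poll margins, sample sizes, and assumed uncertainty below are all made up for illustration — this is not Silver’s actual model, which accounts for pollster quality, house effects, time decay, and much more.

```python
import math

# Hypothetical poll results: (margin for candidate A in points, sample size).
# These numbers are illustrative, not real polling data.
polls = [(2.0, 800), (3.5, 1200), (-1.0, 600), (1.5, 1000)]

# Organization: weight each poll by its sample size (a common, simple scheme).
total_n = sum(n for _, n in polls)
avg_margin = sum(m * n for m, n in polls) / total_n

# Explanation/Generalization: treat the true margin as normally distributed
# around the weighted average, then convert that into a win probability.
stddev = 3.0  # assumed forecast uncertainty in points; real models estimate this
z = avg_margin / stddev
win_prob = 0.5 * (1 + math.erf(z / math.sqrt(2)))  # P(margin > 0) under the normal model

print(f"Weighted average margin: {avg_margin:.2f} points")
print(f"Probability candidate A wins: {win_prob:.1%}")
```

The point is not the arithmetic but the habit: the output is a hedged probability, not a confident declaration — exactly the fox-like, cautious style described above.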
Like data science, data journalism involves finding meaningful insights in a vast sea of information. And like data science, one of its biggest challenges is convincing people to actually listen to what the data is telling them. After FiveThirtyEight posted its prediction of a possible change in control of the Senate in 2014, Democrats reacted with the same bluster Republicans had shown back in 2012. At about the same time, economist Paul Krugman started a feud with Silver over — in my view — relatively minor journalistic differences. Meanwhile, conservatives gleeful at this apparent Leftie infighting continue to predict Silver’s ultimate failure because they still believe that politics is more art than science.
This seems to be a fundamental misunderstanding of what Silver and others like him are trying to do. Rather than look at how successful Silver’s forecasting methodology has been at predicting political results, most people seem to be treating him as just another pundit who has joined the political game. Lost in all of the fuss is his attempt to bring a little more scientific rigor to an arena that is dominated by people who generally operate on intuition and gut instinct. I’m certainly not trying to elevate statisticians and data journalists to god-like status here but it is my hope that people will start to recognize the value of unbiased evaluation and include it as one of their tools for gathering information. When it’s fantasy vs. reality, it is always better to be armed with the facts.