Epistemology is the study of what we can know and how we can know it. Unfortunately, the conversation often turns to abstract, brain-in-a-jar style questions. I say this is unfortunate because one of the foremost problems in an era of Big Data is the very practical problem of sorting through a vast array of available information and trying to arrive at something close to objective truth. Do we do this through quantitative analysis? If so, which numbers do we use? How about qualitative analysis? culture? narrative? talking to as many people as possible? gut intuition? surveys of experts? computer models? econometric models? fashion models? And once we’ve figured out what to do (if we can), do we actually get objective truth, or do we just get closer to it? Or is the whole thing hopeless, with each answer being just as good as the next?
Though it wouldn’t normally be considered a philosophy text, Nate Silver’s The Signal and the Noise does a very good job of answering these questions (and occasionally it addresses them directly in philosophical terms).
Scientific American estimates that the human brain can store about 2.5 million gigabytes of information, enough to hold three million hours (about 300 years) of TV shows. That’s a lot of data, but it’s only one one-thousandth of the 2.5 billion gigabytes of information that IBM estimates is being produced each day. (Check out this really awesome graphic of Computers v. Brains.)
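A quick back-of-the-envelope check of these figures, assuming IBM’s commonly cited estimate of 2.5 quintillion bytes produced per day:

```python
# Sanity check of the figures above (IBM's daily-data estimate is an
# assumption here: 2.5 quintillion bytes, i.e. 2.5e18 bytes per day).
brain_capacity_gb = 2.5e6                  # Scientific American: ~2.5 million GB
daily_data_bytes = 2.5e18                  # IBM: bytes produced per day
daily_data_gb = daily_data_bytes / 1e9     # 2.5 billion GB per day

ratio = brain_capacity_gb / daily_data_gb
print(ratio)  # 0.001 -- one one-thousandth, as claimed
```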
That’s a lot of data to sort through, and it’s why Silver’s terms ‘signal’ and ‘noise’ are incredibly useful in understanding what’s going on. Human beings naturally try to find patterns; it’s just how we’re wired, and most of the time it’s an incredibly useful skill. The ‘signal’ refers to an actual pattern hidden among a mass of irrelevant data points, the ‘noise’, which makes the pattern very difficult to find. Unfortunately, with so much information to sort through, it’s easy to mistake noise for signal and wind up with mistaken interpretations.
There are two main ways to mistake noise for signal. One is to pick up on random correlations and think they are meaningful. To give a quick example, there’s only a one-in-thirty-six chance of rolling two sixes in a row, but if you roll the die enough times it would actually start to be suspicious if you didn’t eventually get two sixes in a row. With so much information out there, a few things are bound to look like statistically significant correlations while actually being just noise.
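The dice example is easy to simulate. In a thousand rolls of a fair die there are 999 adjacent pairs, so at a 1/36 chance per pair we should expect double sixes to turn up roughly 999/36 ≈ 28 times; a minimal sketch:

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

# Roll a fair die 1,000 times and count adjacent pairs of sixes.
rolls = [random.randint(1, 6) for _ in range(1000)]
double_sixes = sum(1 for a, b in zip(rolls, rolls[1:]) if a == b == 6)

# Expectation is ~28; getting zero would be the genuinely suspicious outcome.
print(double_sixes)
```

The point generalizes: scan enough random data and some “pattern” will always turn up, which is exactly why an apparent correlation found by searching is weaker evidence than one predicted in advance.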
The other way to mistake noise for signal is to succumb to confirmation bias. Confirmation bias is our tendency to remember the information that fits our preconceived notions or doesn’t threaten us socially/economically/politically while ignoring information that contradicts our present worldview. If we already have a theory about what is going on then we can pick out the data points that support our theory and ignore the ones that don’t.
Because we can never actually completely remove ourselves from the universe we can’t be objective; we will always be looking at reality from some subjective viewpoint. However, not being able to access Objective Truth does not mean that all is lost. As Silver writes, “This [not having objective truth] does not imply that all prior beliefs are equally correct or equally valid. But I’m of the view that we can never achieve perfect objectivity, rationality, or accuracy in our beliefs. Instead, we can strive to be less subjective, less irrational, and less wrong.”
The practical question for modern epistemology is how we go about being less subjective, less irrational, and less wrong. There is not one simple answer to this question. The best predictions combine quantitative data, qualitative data, and human intuition. I think this is probably also the best way to make sense of the present. Quantitative data can’t tell you everything, but if your intuition and the quantitative data point in completely opposite directions, it’s probably time to rethink your intuition. Rigorous studies can help to reduce instances of confirmation bias, but they don’t eliminate it.
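Silver’s own preferred tool for combining prior intuition with new data is Bayes’s theorem, which the book discusses at length. A minimal sketch of a single Bayesian update, with purely illustrative numbers (not taken from the book):

```python
def bayes_update(prior, p_data_if_true, p_data_if_false):
    """Posterior probability of a hypothesis after one piece of evidence."""
    numerator = prior * p_data_if_true
    return numerator / (numerator + (1 - prior) * p_data_if_false)

# Illustrative numbers: intuition says a claim is 30% likely; a study would
# report this result 80% of the time if the claim is true, 10% if it is false.
posterior = bayes_update(prior=0.30, p_data_if_true=0.80, p_data_if_false=0.10)
print(round(posterior, 3))  # 0.774 -- more confident than before, still not certain
```

The shape of the calculation captures the essay’s point: the evidence moves you from 30% to about 77%, not to certainty, and a second, contradicting study would move you back down. You become less wrong one update at a time.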
Other people are also important to this process, not only because we might have done the math wrong or overlooked something, but also because other people will have a different subjective position on issues simply by virtue of being different people.
This understanding of epistemology calls for humility, but not for despair or inaction. We can know that certain viewpoints are wrong and that some beliefs are significantly more likely to be correct than others. As in the field of prediction (which is what Silver is really studying), where very little is certain but many things are highly probable, so too in epistemology.