“...there are known-knowns; there are things we know we know. We also know there are known-unknowns; that is to say we know there are some things we do not know. But there are also unknown-unknowns—the ones we don't know we don't know.” - Donald Rumsfeld
When we approach a scientific topic, be it climate science, cancer research, crop yields, or nuclear physics, we typically expect two distinct things. First, we expect the scientists in the field to have answers to our questions. What’s the weather going to be tomorrow? How do I stop my tomato plants from dying? Can we use nuclear energy to power our city? Second, we expect individual scientists to expand the general body of scientific knowledge by attempting to understand the unknown. We fund research hoping to shine a light into the many dark corners of the natural world that are not yet understood. Can we create better weather forecasts? Can we improve current methods of growing tomatoes? Can we discover a safer way to use nuclear energy?
These two expectations, to know and to find, are examples of known-knowns and known-unknowns. We expect scientists to have a body of knowledge (known-knowns) as well as to be capable of pushing the boundaries to expand this body of knowledge (known-unknowns). We look to science textbooks for answers and fund research grants with clear expectations of improving our world. The unknown-unknowns are pesky and troublesome. Since we don’t know what they are, it’s difficult to make a plan to discover them; often, we rely on luck to uncover them. (I’ll talk about unknown-unknowns in a future post.)
However, science (in general) and scientists (in particular) are often approached with another question that the tools and methods of science are not designed to address: what should we do? What should we do to address climate change? What should we do to reverse the trend in cancer rates? What should we do about nuclear energy, tar sands, wind energy, habitat loss, ocean acidification, and infant mortality?
I want to point out here that I am talking about pure science. Applied science and engineering have real-world, practical components with budget, political, or regulatory limitations and expectations that direct "what should be done." But even a pure scientist, who is solely concerned with scientific discovery, does not always wear a pure science hat. Every scientist is also a citizen, with a personal belief or value system that is almost always partially independent of their scientific training.
This “should” question is the question of our time, and science, on its own, is utterly unable to address it. The following conceptual diagram, which traces back to Plato (I’ve changed Plato’s ‘values’ to ‘beliefs’), helps us understand why:
This figure makes a clear distinction between what is true and what we believe. By and large, most people strive to merge the two spheres, but often fail. Beliefs are not always true. For instance, nearly 90% of Americans who drive a car believe that they are better than the average driver [1]. If “average” means the typical (median) driver, this cannot be true: by definition, only half of all drivers can be better than the median driver, and the other half must be worse.
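The "better than average" claim hides a small statistical subtlety worth making concrete: with a skewed distribution, a majority really can be above the *mean*, but never more than half can be strictly above the *median*. A toy sketch with made-up skill scores (purely hypothetical numbers, not real driving data):

```python
# Hypothetical skill scores: one very bad driver drags the mean down.
skills = [1, 9, 9, 9, 9]

mean = sum(skills) / len(skills)                # 7.4
median = sorted(skills)[len(skills) // 2]       # 9

above_mean = sum(s > mean for s in skills)      # 4 of 5 beat the mean
above_median = sum(s > median for s in skills)  # 0 are strictly above the median

print(f"{above_mean} of {len(skills)} drivers are better than the mean")
print(f"{above_median} of {len(skills)} drivers are better than the median")
```

So "most drivers are better than average" can be literally true for the mean, which is why the illusory-superiority studies are usually framed in terms of the median.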
There may also be truths that we don’t believe. For instance, many of us are afraid of sharks and believe them to be extremely dangerous. However, in a given year more people are killed by horses, cows, bees, and deer than by sharks [2]. This fact, however, is very unlikely to alter many people’s beliefs about how dangerous sharks are.
Where does science fit into this? Science, which is concerned exclusively with the observable world, is able to speak only to the truth circle. In addition, science is certain of some things and uncertain about others. We are certain (in the center of the truth circle) that CO2 is made up of an atom of carbon and two atoms of oxygen. We are highly certain (towards the center of the truth circle) that increased concentrations of CO2 in the Earth’s atmosphere increase the global temperature. We are uncertain (towards the boundary of the truth circle) exactly how increased CO2 concentrations will influence the atmosphere/ocean/ice/land/biological system.
And, according to Plato and the above diagram, knowledge, wisdom, and decisions for action exist at the overlap between truth and belief. For instance, the question “how much CO2 should we emit?” depends both on the truth (via science) and our beliefs about how much is acceptable (via a multitude of sources). Or the question “should I eat meat?” depends on the current scientific understanding of the state of meat production in the modern world and each person’s individual belief in what is right [3].
This is an important distinction, because people frequently advocate a course of action and cite science as if that were all that was needed to make a decision. What people often leave out when advocating a course of action is a statement about their particular belief system. There are numerous examples: climate change, renewable energy, hydrofracking, meat production, vaccines, abortion, and so on.
Much of the bitter debate, anger, and confusion related to these issues comes from this basic misunderstanding. When someone says, “we should do [this action] because of [this scientific conclusion]!”, what they are really saying is “we should do [this action] because of [this scientific conclusion] and because I believe in [this value system]!” A scientific conclusion cannot on its own address questions of “should.”
Future attractions:
I’ve written a lot about issues surrounding climate science, so in the following posts I will review many of the “known-knowns” of climate science itself. After that I will address some of the known-unknowns, and then I’ll discuss some of the potential unknown-unknowns.
[1]: http://en.wikipedia.org/wiki/Illusory_superiority#Driving_ability
[2]: http://www.geog.ucsb.edu/events/department-news/1195/forget-sharks-cows-are-more-likely-to-kill-you/
[3]: Personally, I love meat, but I strive to make sure it is locally sourced and humanely treated. That’s not a scientific evaluation but a belief/value that I hold.