The Dirty Secret of Science


There is an interesting article in Wired magazine titled The Neuroscience of Screwing Up. The main lesson from the article is that humans innately ignore inputs that contradict or don’t fit within their worldviews. As such, we need others around us to look at the same phenomena or data, ask questions, and poke holes in our illusions and misconceptions. We need the help of others to keep us from screwing up.

The neuroscience part of the article refers to several behaviors and parts of our brain that are designed to help us ignore inputs that don’t fit with our expectations. The dorsolateral prefrontal cortex, or DLPFC, is particularly troublesome. This part of our brain edits out the bits of reality that don’t fit with our preconceptions. We have all witnessed people seeing but not seeing an unpleasant piece of reality, and it seems even the well-honed minds of scientists do this when faced with unexpected experimental results.

When an experiment fails, we are often disappointed, but we should place more value on the unexpected results of experimentation. Some fortunate scientists discovered penicillin, X-rays, even the cosmic microwave background radiation quite by accident or as a byproduct of other experiments. It makes one wonder just how many world-changing discoveries have been overlooked, ignored, or buried by our dorsolateral prefrontal cortex.

Kevin Dunbar, Director of the Laboratory for Complex Thinking and Reasoning, gives a memorable quote:

“Experiments rarely tell us what we expect. That’s the dirty secret of science.”

So it’s all the more important to listen to what experiments are telling us. Yet our brains, left to their own devices, are efficient at snipping out the bits that don’t fit in. The article reveals that the breakthroughs were not achieved as a result of brilliant individual researchers working in isolation, but instead

“questions asked during a group session frequently triggered breakthroughs, as scientists were forced to reconsider data they’d previously ignored”

The most productive lab meetings were those that included people from different backgrounds. This forced the experts to drop some of their jargon and resort to metaphors in order to understand each other. A team of researchers deeply lost within the technical details may become unable to look at the problem in a new way. Or, faced with an unexpected result, they may be unable to grasp its significance. Thanks in part to our DLPFC, we too easily ignore information that does not support our personal worldview or experimental framework.

To anyone who has been on a kaizen event with a cross-functional kaizen team, this is not a surprising finding. The first of the 10 commandments of kaizen is “abandon fixed ideas,” so that we can all be open to the facts and observed experimental results rather than our preconceived notions. The more we let go of our preconceived notions, the less of a frame of reference the dorsolateral prefrontal cortex has for editing out the bits of reality that don’t fit. The more we can work with the facts, the less the chance we’ll screw it up.

This all reminds me of some words from the American economist John Kenneth Galbraith:

“Faced with the choice between changing one’s mind and proving that there is no need to do so, almost everybody gets busy on the proof.”

When conducting experiments either for pure research or to make improvements in your processes and systems, form a cross-functional team, leave your preconceptions at the door, look at the facts no matter how unexpected, and beware the DLPFC.


  1. Chris

    January 18, 2010 - 5:35 am

    This type of “science” is ridiculous. How can we know the scientists weren’t subjected to their own brain malfunction that they have not yet been able to identify because of a brain malfunction?

  2. JDSS

    January 18, 2010 - 9:34 am

    Just goes to show we can learn more from mistakes (including others’ mistakes) than from doing things right…

  3. Sean

    January 18, 2010 - 11:30 am

    It’s easy to shoot holes in any brain science study.
    When we focus on the fact that the study isn’t perfect, and we know that no study is perfect, especially if humans are involved, then aren’t we playing the same game that Galbraith mentions? Is the conclusion that people prefer evidence that supports their views really that surprising?
    Can we still learn something from this “ridiculous” study?
    1) questioning pre-conceived notions may be the path to improved performance
    2) even scientists can succumb to confirmation bias

  4. Rich Poliak

    January 21, 2010 - 12:03 pm

    Interesting article, and from my business experience I agree that all too often we have a habit of ignoring evidence that contradicts our mental models, theories, etc. However, this also puts an unwarranted cynical perspective on the scientific method.
    For example, I recall in my early physics classes we performed experiments trying to demonstrate Coulomb’s Law (the force between two charges is inversely proportional to the square of the distance between them). Being able to reproduce the results accurately was exceedingly difficult even though our equipment was several centuries more advanced. I can think of several other instances of trying to reproduce experiments on well-understood phenomena. These experiences reinforced the fact that scientists need a combination of dogged persistence and an open mind as well.
    As Richard Feynman said, “It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.”
    Or think of a quote by Edison, “If I find 10,000 ways something won’t work, I haven’t failed. I am not discouraged, because every wrong attempt discarded is another step forward.”
    In a recent blog post of mine I speak of Leonardo da Vinci, the inventor of the scientific method and a practitioner of Gemba.
    His approach to learning combined considerable observation with theory: he created his theories, modified them with new learning, and continued to add to his knowledge.
    So while I agree that bias towards one’s theories can be a stumbling block even in the light of new evidence, the scientific method, when applied diligently and with an open mind, can work wonders.

  5. mike

    January 25, 2010 - 5:56 am

    “Faced with the choice between changing one’s mind and proving that there is no need to do so, almost everybody gets busy on the proof.”
    Sounds sensible. I can’t change my mind every time I hear a new fad or marketing blurb.