A light look at some heavy mistakes
What is knowledge, and how do we know what we know? These are questions that have preoccupied philosophers for millennia.
Modern science relies heavily on the empirical method, making use of experience and the reproducibility of events to accept or reject hypotheses. Medicine prides itself on this, and with good cause: This is one of the reasons why the populations of industrialized countries are significantly healthier and live much longer than those of developing countries.
Despite the advances that medicine has made over the last two centuries, a sizable number of people still do not accept the primacy of the empirical method in determining whether a hypothesis is valid. While this may be less of a concern when addressing questions such as the number of days versus billions of years it took to create the universe, it can have life-or-death implications when trying to decide how best to treat an ailing patient.
Ben Goldacre, a British physician and author, has written a very funny and biting book critiquing what he calls “Bad Science.” Under this heading he includes homeopathy, cosmetics manufacturers whose claims about their products defy plausibility, proponents of miracle vitamins, and drug companies and physicians who design faulty studies and manipulate the results.
Explaining why empirical and not just anecdotal evidence is necessary to prove cause and effect, he asks, “Does the rooster’s cry cause the sun to rise? No. Does this light switch make the room get brighter? Yes.” But to his chagrin, he finds many who refuse to accept the similarity between the example of the rooster and the sun and the beliefs they hold dear.
This leads him to concede the difficulty of getting people to see the faults in their logic. “You cannot reason people out of positions they didn’t reason themselves into,” Goldacre writes.
He notes that when confronted with the irrationality of their beliefs, “instead of addressing the criticisms, or embracing the new findings in a new model, they seem to shift the goalposts and retreat, crucially, into untestable positions.”
The inability to distinguish between fact and fantasy can have grave consequences. Willful manipulation of the results of poorly designed studies, cherry-picking of data (stressing positive outcomes and spinning or ignoring negative ones), and outright scientific fraud have all resulted in poor outcomes for patients, including the worsening of disease and even death.
Likewise, the unwillingness of people in positions of power to expose their beliefs to scientific examination has led to catastrophic results, such as occurred in South Africa in the early 2000s, when the government refused to provide antiretroviral drugs to treat AIDS because of a belief that they were less effective than “beetroot, garlic, lemons, and African potatoes.” The resulting delay in providing appropriate treatment from 2000 to 2005 has been estimated as causing the early deaths of 350,000 South Africans.
“If I weren’t writing a light and humorous book about science right now, I would descend into gales of rage,” Goldacre writes. His frustration is keenly felt. While it is a very entertaining book, it also provides important insight into the horrifying outcomes that can result when willful anti-intellectualism is allowed equal footing with scientific methodology.
Dennis Rosen, a pediatric lung and sleep specialist at Children’s Hospital Boston, can be reached at firstname.lastname@example.org.