Over at Scientific American Mind, Gerd Gigerenzer and his colleagues have published a terrific article documenting the statistical illiteracy that sometimes runs rampant in the health care fields. The article, "Knowing Your Chances," appears in the April/May/June 2009 edition.
The authors point out numerous medical care fallacies caused by statistical illiteracy, including Rudy Giuliani's 2007 claim that, because 82% of Americans survived prostate cancer compared to only 44% in England, he was lucky to be living in the United States and not in England. This sort of claim is based on Giuliani's failure to understand statistics. Yes, in the United States men are diagnosed with prostate cancer more quickly (because many more of them are given PSA tests), and many more of them are then treated. Yet despite the stark difference in survival rates (the percentage of patients who survive the cancer for at least five years), mortality rates in the two countries are close to the same: about 26 prostate cancer deaths per 100,000 American men versus 27 per 100,000 in Britain. That fact suggests the PSA test has needlessly flagged prostate cancer in many American men, resulting in a lot of unnecessary surgery and radiation treatment, which often leads to impotence or incontinence. Because of overdiagnosis and lead-time bias, changes in five-year survival rates have no reliable relation to changes in mortality when patterns of diagnosis differ. And yet many official agencies continue to talk about five-year survival rates.
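To see why five-year survival can soar even when nobody lives a single day longer, here is a minimal sketch in Python with made-up numbers (mine, not the article's): every patient in a toy cohort dies of the cancer at age 70, and the only thing screening changes is the age at which the cancer is found.

```python
# Toy illustration of lead-time bias (hypothetical numbers, not from the article).
# Every patient in this cohort dies of the cancer at age 70, so earlier
# diagnosis cannot save anyone -- it only starts the survival clock sooner.

def five_year_survival(ages_at_diagnosis, age_at_death=70):
    """Fraction of patients still alive five years after diagnosis."""
    alive_at_five_years = [a for a in ages_at_diagnosis if age_at_death - a >= 5]
    return len(alive_at_five_years) / len(ages_at_diagnosis)

cohort = 1000
diagnosed_by_symptoms = [67] * cohort   # no screening: cancers found late
diagnosed_by_screening = [60] * cohort  # screening: the same cancers found early

print(five_year_survival(diagnosed_by_symptoms))   # 0.0 -- looks dismal
print(five_year_survival(diagnosed_by_screening))  # 1.0 -- looks like a triumph
# Yet in both scenarios every patient still dies at 70: mortality is unchanged.
```

The survival statistic jumps from 0% to 100%, yet exactly the same people die at exactly the same age, which is why mortality per 100,000 is the more honest yardstick.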
Gigerenzer and his colleagues give a highly disturbing example regarding mammogram results. Assume that a woman has just received a positive test result (suggesting breast cancer) and asks her doctor, "What are the chances that I have breast cancer?" In a dramatic study, researchers asked 160 gynecologists taking a continuing education course to give their best estimate based upon the following facts:
A) The probability that a woman has breast cancer (prevalence) is 1%.
B) If a woman has breast cancer, the probability that she tests positive (sensitivity) is 90%.
C) If a woman does not have breast cancer, the probability that she nonetheless tests positive (false-positive rate) is 9%.
The best answer can be quickly derived from the above three statements: only about one out of ten women who test positive actually has breast cancer. The other nine out of ten have received a false positive result. Yet only 21% of the physicians picked the right answer, and 60% of the gynecologists believed that there was either an 81% or a 90% chance that a woman with a positive test result actually had cancer, suggesting that they routinely cause horrific and needless fear in their patients.
What I found amazing is that you can quickly and easily determine that roughly 10% is the correct answer from the above three statements: simply imagine 100 patients. One of them (1%) actually has breast cancer and will almost certainly test positive, while about nine of the remaining 99 (9%) test falsely positive. That is grade school mathematics: of the roughly ten women who test positive, only one, about 10%, actually has breast cancer.
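For readers who want to check the arithmetic, here is a minimal sketch in Python (my own, not from the article) that computes the probability two ways: with Bayes' rule and with the natural-frequency shortcut just described, scaled to 1,000 women to avoid fractional people.

```python
# The mammogram numbers quoted above, checked two ways.

prevalence = 0.01        # P(cancer)
sensitivity = 0.90       # P(positive | cancer)
false_positive = 0.09    # P(positive | no cancer)

# Bayes' rule: P(cancer | positive test)
p_positive = prevalence * sensitivity + (1 - prevalence) * false_positive
ppv = prevalence * sensitivity / p_positive
print(round(ppv, 3))  # ~0.092, i.e. roughly one woman in ten

# Natural frequencies: out of 1,000 women, 10 have cancer and 9 of them test
# positive; of the 990 without cancer, about 89 test positive anyway.
true_positives = 1000 * prevalence * sensitivity              # ~9
false_positives = 1000 * (1 - prevalence) * false_positive    # ~89
print(round(true_positives / (true_positives + false_positives), 3))  # ~0.092
```

Both routes land on the same answer of roughly 9%, the figure only about a fifth of the gynecologists got right.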
As the article describes, false diagnoses and bad interpretations often combine (e.g., in the case of HIV tests) to result in suicides, needless treatment, and immense disruption in patients' lives.
The authors also discuss the (tiny) increased risk of blood clots caused by taking third-generation oral contraceptives. Because the news media and consumers so often exhibit innumeracy, news about this risk was communicated in a way that caused great anxiety: people learned that the third-generation pill increased the risk of blood clots by "100%." The media should have packaged the risk in a more meaningful way: whereas one out of 7,000 women who took the second-generation pill had a blood clot, this increased to two in 7,000 women who took the new pill. The "absolute risk increase" should have been more clearly communicated.
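The gap between the scary headline and the informative one is simply the difference between relative and absolute risk, which a few lines of arithmetic make plain. This sketch just uses the 1-in-7,000 and 2-in-7,000 figures quoted above:

```python
# Relative vs. absolute risk for the contraceptive example, using the figures above.
baseline_risk = 1 / 7000   # blood clots with the second-generation pill
new_risk = 2 / 7000        # blood clots with the third-generation pill

relative_increase = (new_risk - baseline_risk) / baseline_risk  # 1.0 -> the "100%" headline
extra_clots_per_7000 = (new_risk - baseline_risk) * 7000        # 1.0 -> one extra woman in 7,000

print(f"relative risk increase: {relative_increase:.0%}")
print(f"absolute risk increase: {extra_clots_per_7000:.0f} extra blood clot per 7,000 women")
```

Both statements describe the same data; only the second one tells a woman what the risk actually means for her.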
Check out the full article for additional reasons to be concerned about statistical illiteracy.