I recently wrote an article for a community newspaper attempting to explain to scientifically naive readers why testimonial “evidence” is unreliable; unfortunately, they decided not to print it. I considered using it here, but I thought it was too elementary for this audience. I have changed my mind and I am offering it below (with apologies to the majority of our readers), because it seems a few of our readers still don’t “get” why we have to use rigorous science to evaluate claims. People can be fooled, folks. All people. That includes me and it includes you. Richard Feynman said:
The first principle is that you must not fool yourself–and you are the easiest person to fool.
Science is the only way to correct for our errors of perception and of attribution. It is the only way to make sure we are not fooling ourselves. Either Science-Based Medicine has not done a good job of explaining these vital facts, or some of our readers are unable or unwilling to understand our explanations.
Our commenters still frequently offer testimonials about how some CAM method “really worked for me.” They fail to understand that they have no basis for claiming that it “worked.” All they can really claim is that they observed an improvement following the treatment. That could indicate a real effect, or it could indicate an inaccurate observation, or it could indicate a post hoc ergo propter hoc error: the false assumption that temporal correlation implies causation. Such observations are only a starting point: we need to do science to find out what the observations mean.
Last week one of our commenters wrote the worst testimonial yet:
I have witnessed first hand the life that begins to flow through the body upon the removal of a subluxation.
What does this even mean? Does he expect anyone to believe this just because he says it? Would he believe me if I said I had witnessed first hand the invisible dragon in Carl Sagan’s garage?
Another commenter wrote:
Those pro SBM commenters here seem to think that even if they see something with their own eyes that they can’t believe it if there are no double blinded officially published studies to prove that what they saw actually happened.
Well, yes, that’s pretty much what we do think; and we are appalled that you don’t understand it yet, since it’s the whole reason we have to do science. I would phrase it a bit differently: “Seeing something with my own eyes doesn’t prove it’s true and it doesn’t preclude the necessity for scientific testing.”
We can’t believe our own eyes. The very process of vision itself is an interpretive construct by the brain. There are two blind spots in our field of vision and we aren’t even aware of them. I saw a magician cut a woman in half on stage – that was an illusion, a false perception. I saw a patient get better after a treatment – my interpretation that the treatment caused the improvement may have been a mistake, a false attribution.
So for those who still don’t get it, here’s my simplistic article:
—————————————
Sometimes We Get It Wrong
How can you know whether a medical treatment really works? If everybody says it works, and it worked for your Aunt Sally, and you try it and your symptoms go away, you can pretty well assume it really works. Right?
No, you can’t make that assumption, because sometimes we get it wrong. For many centuries doctors used leeches and lancets to relieve patients of their blood. They KNEW bloodletting worked. Everybody said it did. When you had a fever and the doctor bled you, you got better. Everyone knew of a friend or relative who had been at death’s door until bloodletting cured him. Doctors could recount thousands of successful cases.
All those people got it wrong. When George Washington got a bad throat infection, his doctors removed so much of his blood that his weakened body couldn’t recover, and he died. We finally got around to testing bloodletting and found out it did much more harm than good. Patients who got well had been getting well IN SPITE of bloodletting, not because of it. And some patients had died unnecessarily, like George Washington.
Even modern doctors sometimes get it wrong. Not long ago, doctors did an operation for heart disease in which they opened the chest and tied off chest wall arteries to divert more blood flow to the heart. They had an impressive 90% success rate. A smart doctor named Leonard Cobb wanted to make sure, so he did an experiment in which he just made the incision in the chest and closed it back up without actually tying off the arteries. He discovered that just as many patients improved after the fake surgery! Doctors stopped doing that operation.
How could so many people be so wrong? How could they believe something had helped them when it actually had done them more harm than good? There are a whole lot of reasons people can come to believe an ineffective treatment works.
- The disease may have run its natural course. A lot of diseases are self-limiting; the body’s natural healing processes restore people to health after a time. A cold usually goes away in a week or so. To find out if a cold remedy works, you have to keep records of successes and failures for a large enough number of patients to find out if they really get well faster with the remedy than without it.
- Many diseases are cyclical: their symptoms fluctuate over time. We all know people with arthritis have bad days and good days. The pain gets worse for a while, then it gets better for a while. If you use a remedy when the pain is bad, the pain was probably about to start improving anyway, so the remedy gets credit it doesn’t deserve (the sketch after this list shows how this alone can make a useless remedy look effective).
- We are all suggestible. If we’re told something is going to hurt, it’s more likely to hurt. If we’re told something is going to make us feel better, it probably will. We all know this: that’s why we kiss our children’s scrapes and bruises. Anything that distracts us from thinking about our symptoms is likely to help. In scientific studies that compare a real treatment to a placebo sugar pill, an average of 35% of people say they feel better after taking the sugar pill. The real treatment has to do better than that if we’re going to believe it’s really effective.
- There may have been two treatments and the wrong one got the credit. If your doctor gave you a pill and you also took a home remedy, you may give the credit to the home remedy. Or maybe something else changed in your life at the same time that helped treat the illness and that was the real reason you got better.
- The original diagnosis or prognosis may have been incorrect. Lots of people have supposedly been cured of cancer who never actually had cancer. Doctors who tell a patient he only has 6 months to live are only guessing and can guess wrong. The best they can do is say that the typical patient with that condition lives about 6 months, and roughly half of such patients live longer than that.
- Temporary mood improvement can be confused with cure. If a practitioner makes you feel optimistic and hopeful, you may think you feel better when the disease is really unchanged.
- Psychological needs can affect our behavior and our perceptions. When someone wants to believe badly enough, he can convince himself he has been helped. People have been known to deny the facts – to refuse to see that the tumor is still getting bigger. If they have invested time and money, they don’t want to admit it was wasted. We see what we want to see; we remember things the way we wish they had happened. When a doctor is sincerely trying to help a patient, the patient feels a sort of social obligation to please the doctor by improving.
- We confuse correlation with causation. Just because an effect follows an action, that doesn’t necessarily mean the action caused the effect. When the rooster crows and then the sun comes up, we realize it’s not the crowing that made the sun come up. But when we take a pill and then we feel better, we assume it was the pill that made us feel better. We don’t stop to think that we might have felt better for some other reason. We jump to conclusions like the man who trained a flea to dance when it heard music, then cut the flea’s legs off one by one until it could no longer dance and concluded that the flea’s organ of hearing must be in its legs!
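For readers comfortable with a little code, here is a minimal sketch of the “cyclical disease” point above. It is a made-up simulation, not data from any study: the pain level rises and falls on its own, the “remedy” does nothing at all, and yet taking it only on the worst days is reliably followed by improvement.

```python
import math
import random

random.seed(1)

# A made-up model of a cyclical illness: pain rises and falls on a slow
# 60-day cycle with some day-to-day noise. The "remedy" does nothing at all.
def pain_level(day):
    cycle = 3 * math.sin(2 * math.pi * day / 60)   # slow good-weeks/bad-weeks cycle
    noise = random.uniform(-1.5, 1.5)              # day-to-day fluctuation
    return 5 + cycle + noise                       # pain on a rough 0-10 scale

improvements = []
for day in range(10_000):
    today = pain_level(day)
    if today >= 8:                                  # "take the remedy" only on very bad days
        two_weeks_later = pain_level(day + 14)
        improvements.append(today - two_weeks_later)

# Pain two weeks after each "treatment" is lower on average: not because the
# inert remedy did anything, but because the worst days sit near the top of
# the cycle and are usually followed by better days anyway.
print(f"average improvement after the inert remedy: "
      f"{sum(improvements) / len(improvements):.1f} points")
```

This is exactly the trap the arthritis example describes: the worst days are the ones most likely to be followed by better days, with or without treatment.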
So there are lots of ways we can get it wrong. Luckily, there’s a way we can eventually get it right: by scientific testing. There’s nothing mysterious or complicated about science: it’s just a toolkit of common-sense ways to test things. If you believe you’ve lost weight and you step on the scale to test your belief, that’s science. If you think you have a better way to grow carrots and you test your idea by planting two rows side by side, one with the old method and one with the new method, and see which row produces better carrots, that’s science. To test medicines, we can randomly sort a large number of patients into two similar groups and give one group the treatment we’re testing and give the other group an inert placebo, like a sugar pill. If the group that got the active treatment does significantly better, the treatment probably really works.
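Here is an equally simple sketch of that comparison. The 35% placebo figure comes from the list above; the 60% response rate for the “real treatment” and the group size of 200 patients are made-up numbers chosen only for illustration.

```python
import random

random.seed(2)

# A toy simulation of the two-group comparison described above, with made-up
# numbers: roughly 35% of people report improvement on a sugar pill, and a
# hypothetical treatment genuinely helps 60% of the patients who take it.
def run_trial(treatment_rate, placebo_rate=0.35, patients_per_group=200):
    """Simulate two groups of patients and return the fraction improving in each."""
    treated = sum(random.random() < treatment_rate for _ in range(patients_per_group))
    placebo = sum(random.random() < placebo_rate for _ in range(patients_per_group))
    return treated / patients_per_group, placebo / patients_per_group

# A useless remedy: its true response rate is no better than the placebo's.
useless_treated, useless_placebo = run_trial(treatment_rate=0.35)
print(f"useless remedy: {useless_treated:.0%} improved vs {useless_placebo:.0%} on placebo")

# A genuinely effective treatment with a 60% true response rate.
real_treated, real_placebo = run_trial(treatment_rate=0.60)
print(f"real treatment: {real_treated:.0%} improved vs {real_placebo:.0%} on placebo")
```

Notice that plenty of people improve even on the sugar pill. A real trial would add blinding and a formal statistical test to decide whether the difference between the groups is bigger than chance could explain, but the basic logic is the same: the treated group has to beat the placebo group, not merely improve.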
Jacqueline Jones was a 50-year-old woman who had suffered from asthma since the age of 2. She read about a miraculous herbal treatment that cured a host of ailments including asthma. She assumed the information was true, because it included lots of testimonials from people who had used it and were able to stop taking their asthma medications. They KNEW it worked. They had SEEN it work. Sick of the side effects of conventional drugs, Jacqueline stopped using her three inhalers, steroids, and nebulizer, and took the herbal supplement instead. Within two days she was in the hospital.
I had a massive asthma attack. I was seriously ill in hospital for six weeks, during which I developed pleurisy. I couldn’t breathe and my lungs were so sensitive that even touching the area on the outside felt like someone was kicking me.
All those people who said that herbal remedy had cured their asthma got it wrong. Asthma symptoms fluctuate. Maybe their symptoms would have improved anyway. Whatever the reason, the remedy had never been tested scientifically, it was not effective for treating asthma, and believing those testimonials almost cost Jacqueline her life.
The next time a friend enthusiastically recommends a new treatment, stop and remember that they could be wrong. Remember Jacqueline Jones. Remember George Washington. Sometimes we get it wrong.
Note: Several people in other countries liked this article and wanted to share it in their native languages; it was translated into Spanish, German, Turkish, and Dutch, though some of those translations are no longer available online.
This article was originally published on the Science-Based Medicine blog.