Years ago, when the phrase “evidence-based medicine” was first being popularized, my friend and colleague Dr Bob Meek remarked, “What’s new about that?” He was correct; there was nothing new about the idea. In my experience, doctors have always tried to manage patients based on the best available evidence.
David Sackett, credited with popularizing evidence-based medicine, defined it as the “conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients.” Who could argue with that?
One group that seeks to impose unproven top-down protocols on clinicians calls itself EvidenceNetwork.ca. It shows how one can hijack a theoretically good concept and manipulate the model to create what its members believe are valid guidelines for clinical treatments in which they have little or no expertise. That particular group lists nonclinical, self-proclaimed experts who offer “expert guidance” in those areas.
I believe there are two kinds of experts: those who are biased and know it, and those who are biased and don’t know it. Few are capable of distinguishing between their own prejudiced beliefs and factual evidence. I subscribe to the view of theoretical physicist Richard Feynman, who wrote, “Science is the belief in the ignorance of experts.”
History tells us experts are often wrong but seldom in doubt. Experts at Decca Records rejected the Beatles in 1962, stating, “We don’t like their sound, and guitar music is on the way out.” In 1927, Warner Brothers asked, “Who the hell wants to hear actors talk?” In 1974, Margaret Thatcher pronounced, “It will be years—not in my time—before a woman will become prime minister.”
Some naively believe that their self-generated consensus opinions represent evidence, yet most, if not all, have inherent biases. The world literature cannot reassure us of the value of published expert evidence, even when peer reviewed.
A number of research scientists know and understand the limitations of their field. I have been involved in randomized, prospective, double-blind studies, and I support their application where appropriate.[1,2] I favor properly designed trials and objective studies when feasible, and I am fortunate to have worked with some who exhibit scientific objectivity and understand the clinical role in analyzing research.
The BCMJ is a peer-reviewed journal. We do our best to be objective, but it is vital that we recognize the deficiencies that exist in the process. In 1998, the editor of the British Medical Journal sent an article containing eight deliberate mistakes in design and analysis to over 200 peer reviewers. On average, the reviewers picked up fewer than two of the eight errors.
If written descriptions outlining Columbus’s experiences in the Americas and Darwin’s theory of evolution had undergone peer review by experts, both accounts would likely have been rejected as fantasy. Scientists at Amgen, an American drug company, could replicate only 6 of 53 published studies considered landmarks in cancer science, and John Ioannidis from Stanford has declared that most published research findings are probably false.
So what is the basis for our acceptance of evidence? One widely used approach is the concept of the null hypothesis. Data are collected and calculations made to determine significance; a P value below 0.05 (1 in 20) implies statistical significance. This is a commonly used and abused test in experimental studies and peer review. A famous example of harm, in a case involving thousands, occurred in the early 2000s when many Vioxx users died as a result of excessive faith in the so-called 5% rule of statistical significance.
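To make the 5% rule concrete: even when the null hypothesis is true and no real effect exists, roughly 1 in 20 experiments will cross the P < 0.05 threshold by chance alone. The minimal simulation below (an illustration I have added, not part of the original argument; it uses a permutation test rather than any specific trial design) draws both study groups from the same distribution, so every “significant” result is, by construction, a false positive.

```python
import random

random.seed(42)

def permutation_p(a, b, n_perm=500):
    """Two-sided permutation-test P value for a difference in group means."""
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    hits = 0
    for _ in range(n_perm):
        random.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return hits / n_perm

# Both groups come from the same distribution: the null hypothesis is true,
# so any result with P < 0.05 is a false positive.
false_positives = 0
n_experiments = 200
for _ in range(n_experiments):
    a = [random.gauss(0, 1) for _ in range(20)]
    b = [random.gauss(0, 1) for _ in range(20)]
    if permutation_p(a, b) < 0.05:
        false_positives += 1

print(f"False-positive rate: {false_positives / n_experiments:.2%}")
```

Run enough null experiments and the false-positive rate settles near 5%, which is exactly why a single P < 0.05 finding, unvalidated by clinical outcomes, is weak evidence.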
There is a strong case for decision making based on best practices and evidence. But that evidence needs to be validated by clinical outcomes, not dictated by self-appointed overseers who lack the appropriate clinical expertise and knowledge.
1. Taylor TV, Rimmer S, Day B, et al. Ascorbic acid supplementation in the treatment of pressure sores. Lancet 1974;2(7880):544-546.
2. Chirwa SS, MacLeod BA, Day B. Intraarticular bupivacaine (Marcaine) after arthroscopic meniscectomy: A randomized double-blind controlled study. Arthroscopy 1989;5:33-35.
3. Chambers GK, Schulzer M, Sobolev B, Day B. Degenerative arthritis, arthroscopy and research. Arthroscopy 2002;18:686-687.