Evidence and outcomes

Issue: BCMJ, Vol. 47, No. 2, March 2005, Page 76. Editorials

Be not the first by whom the new is tried, nor yet the last to lay the old aside.

Alexander Pope 1688–1744

The road of scientific progress is frequently blocked for long periods by such errors. (Referring to the stubborn acceptance of existing beliefs)

Albert Einstein 1879–1955

Has the pendulum swung too far in our quest, and even passion, for “evidence-based outcome studies”? Clinicians have been criticized and at times even ridiculed for their forays into new areas. I used to tell residents that none of the surgical techniques they learn are statistically validated, and none would stand up to the requirements of the FDA or Health Canada for the introduction of drugs. Clearly there is a need for clinical trials and evaluation, and over 30 years ago I was a participant in a study that was, to my knowledge, the first prospective, randomized, double-blind study ever initiated and published by surgeons.[1] Later, I co-authored one of the first randomized, prospective, double-blind studies to be published in an orthopaedic journal.[2] Both studies involved drug trials.

Trials of new techniques are needed if they can be compared to truly inert sham or placebo treatments or to an existing conventional treatment. Interventional treatments are often subject to rapid improvements and changes in technology. Variation in technical skill is a complicating factor, and control is attempted by including enough operators and patients that the variables are accounted for. Unfortunately, theory and reality often differ in those attempts. In 1972, at the Royal Society of Medicine in London, a highly skilled surgeon presented his experience with Polya gastrectomy. His excellent results, with measurable outcomes such as “dumping syndrome,” infections, anastomotic leakage, recurrence, and other complications, were unmatched by others. The fact is that he, and many whom he trained, were simply better at the operation.

We need to balance a duty to ensure that patients do not receive unproven treatment with the need not to discourage innovation. We must avoid bias, prejudice, or lack of due diligence in evaluating new treatments. History is full of examples where new advances, theories, or knowledge were rejected: Arrhenius, Galileo, the Wright brothers, and Crick and Watson, to name just a few. The latter’s Nature paper had no experimental validation; it consisted of hypotheses and cited no authorities. The paper was rejected by Harvard University Press. Similar skepticism after the invention of the telephone was even propagated by Alexander Graham Bell’s father. When Fred Smith, the founder of Federal Express, prepared its business plan as an undergraduate at Yale, he was given a “C” grade, with comments that a plan must be feasible to earn a higher grade!

Almost 25 years ago, when I presented the first Canadian series on arthroscopic surgery at the Canadian Orthopaedic Association, the appointed discussant questioned my veracity. Some of my own colleagues even complained to our College that I was making false claims about the technique’s advantages. It was satisfying that several who complained later signed up for my courses in arthroscopic surgery, and that the Toronto-based critic of my first paper went on to offer trainee fellowships in arthroscopic surgery.

The design of prospective, randomized, double-blind controlled trials that include placebo surgery creates ethical dilemmas. Perhaps surgical trials are feasible only if procedures are compared to existing treatments, even when those treatments have never been validated. In the example of arthroscopic surgery, the main difference from existing techniques is that the surgery is less invasive and less painful. It is difficult to justify a trial when patients walk home and often require no analgesics with one procedure, but are hospitalized, receive narcotics, and go home on crutches with the other.

In my own specialty, we observed similar rejection and skepticism with respect to joint replacement and internal fixation of fractures. The introduction and acceptance of new technologies in general surgery and gynecology (laparoscopy), radiology (MRI, CT, and PET scans), and urology (lithotripsy) were criticized, delayed, or withheld for similar reasons. There are numerous examples in other specialties and even outside of medicine. The fact that there is no valid evidence that parachutes are effective is the subject of a satirical paper calling for proper controlled trials of their use.[3]

There is a trend to encourage a new breed of clinician-scientist, trained in and still involved with the realities of clinical medicine, but also familiar with both the importance and the limitations of statistical methods. In protecting patients from potentially harmful or ineffective interventions, we must be careful not to deny them timely access to new treatments that are less invasive, more effective, and less dangerous than those entrenched in traditional medicine.

—BD


References

1. Taylor TV, Rimmer S, Day B, et al. Ascorbic acid supplementation in the treatment of pressure sores. Lancet 1974;2(7880):544-546.
2. Chirwa SS, Day B, McLeod B. Intra-articular bupivacaine (Marcaine) after arthroscopic meniscectomy: A randomized double-blind controlled study. Arthroscopy 1989;5(1):33-35.
3. Smith GC, Pell JP. Parachute use to prevent death and major trauma related to gravitational challenge: A systematic review of randomised controlled trials. BMJ 2003;327:1459-1461.

 

Brian Day, MB. Evidence and outcomes. BCMJ, Vol. 47, No. 2, March 2005, Page 76. Editorials.


