Checklist Help Debunked?

We're always astonished, Rick and I opine on PodMed this week, when sacred cows are slaughtered in medicine, when some precept we've held dear is subjected to rigorous study and discredited.  Such is the case in this week's New England Journal of Medicine, where surgical checklists, those darlings of the patient safety world, had no impact on mortality or surgical complications.  Sacre bleu!  How can it be so?  Let's take a look at the evidence.

Our friends to the north, residents of Ontario, Canada, provided researchers with a 'natural experiment': a plethora of data from acute care hospitals gathered before and after mandatory implementation of surgical safety checklists. The Ministry of Health and Long-Term Care mandated use of surgical safety checklists beginning in July 2010, and hospitals could employ a list of their own devising, the WHO checklist (previously validated in a number of observational studies), or the Canadian Patient Safety Institute checklist. Each hospital is required to report compliance with surgical safety checklists to a publicly reported database in which the hospital is individually identified.

Three-month intervals were studied for all 133 surgical hospitals in Ontario, one concluding three months before checklist implementation and one commencing three months after such a list was employed. All surgical procedures performed during each study period were included in the analysis. Outcome measures included operative mortality, defined as mortality occurring during hospitalization or within 30 days of the procedure; complications occurring within 30 days of surgery; length of hospital stay; rate of readmission within 30 days of discharge; and emergency department visits within 30 days of discharge.

Comorbidities, the patient's socioeconomic status, sex, age, and several other factors were also considered. A total of 101 hospitals were deemed eligible for analysis, revealing an adjusted risk of death of 0.71% before checklist implementation versus 0.65% afterward. Other outcome measures were similarly unaffected by use of a surgical safety checklist, and this paucity of impact persisted across multiple means of analysis and adjustment for different factors. Well. Seems a pretty supportable conclusion. What are possible explanations?

Compliance could be an issue, the authors indicate, citing previous studies that looked at actual rates of compliance and outcomes and showed a linear relationship between the two. Training prior to implementation has also been shown to help, as does team training. The authors also invoke the Hawthorne effect, in which people who know their work is being scrutinized perform better than under normal conditions. They also speculate that previous research demonstrating a very significant impact of checklists may have relied upon extensive checklists covering virtually the entire period from admission to discharge, or upon checklists implemented along with extensive training of care teams. Finally, I would cite our colleague Peter Pronovost, a patient safety guru, who told me that once one source of potential errors is minimized or eliminated, the door opens to other types of errors, an assertion borne out by research in many areas of safety. For now, Rick and I agree, it seems very unlikely that such lists will be abandoned, and perhaps this research will suggest a fruitful area for further endeavor.

Other topics this week include a new blood test for Alzheimer's disease in Nature Medicine, questions about whole genome sequencing in JAMA, and the problem of discontinued randomized clinical trials in the same journal.  Until next week, y'all live well.


Comments

Douglas Crew March 20, 2014 at 4:23 pm

Ms. Tracey:
I echo the comments of Mr. Drayton (boy, I wish all commenters could take a lesson from you!) - very nicely reported.
I've had first-hand experience with the changes that checklists can make in medicine, as well as other areas of life. One thing I've always struggled with (I was an early believer in Gawande's "Checklist Manifesto") was how to implement checklists both at work and in my personal life. Do you know of any sites you'd recommend? I have started using http://www.simplist.me which seems pretty good, but have been perplexed that nobody ever talks about how to actually use checklists once we've bought into the idea. 🙂
Again, nice work, and thank you!


Elizabeth Tracey March 21, 2014 at 7:57 am

Thanks for your kind comments, Douglas. I don't know of any sites that demonstrate this but am aging into the necessity of using them in many aspects of my own life! If you hear of any I'd be happy to be informed, and thanks again for writing.


Herbert L. Drayton III March 15, 2014 at 12:49 pm

This was a measured and responsible post. As I posted to another blog that attempted to marginalize the impact of checklists, the study had glaring gaps. How exactly do you measure the marginal benefit of the use of checklists? One life saved is statistically significant. We introduced checklists into our operations, where we have more than 30,000 patient encounters on a monthly basis. Checklists work.

It’s tough to measure “moving accountability to the lowest possible level” in an organization. How do you quantify a staff nurse saying “I forgot” after the start of a procedure? The culture change required for the use of checklists should not be confined to surgical processes; it should include the entire organization.

Thanks for being a thought leader and not a blind follower.


Elizabeth Tracey March 15, 2014 at 4:28 pm

Thank you so much for your thoughtful comment. One issue we see again and again, and this study highlights it, is observational studies versus more robust types of studies. Clearly you are familiar with the checklist story and know that the data from observational studies looked so robust, and appealed so strongly to our intuitive bias, that wholesale adoption was widespread. Yet as in so many cases, what seems intuitively obvious often doesn't pan out when subjected to more rigorous study, and perhaps that is the case here. Once again the answer is no doubt more study! We'll be watching.

