Talk of transparency is easy, but it’s up to journalists to hold hospitals accountable

July 5, 2016

The Centers for Medicare and Medicaid Services (CMS) recently delayed a plan to issue a “star” rating of individual hospitals’ care after four hospital trade groups organized a letter from 60 senators and 225 representatives questioning the methodology.

The industry was right: A single “star” rating can’t accurately characterize diverse surgical and medical operations. The trade groups’ success, however, raises a tougher question, as I wrote in a recent Health Affairs Blog post: whether hospitals will voluntarily do all they can, without government prodding, to make ratings more “useful and helpful” for patients, as Rick Pollack, president and chief executive officer of the American Hospital Association (AHA), promised in a press release.

Mewling about Medicare is easy. It’s up to journalists to hold local hospitals accountable for matching a professed commitment to transparency with concrete actions. That can be difficult. If you’re ProPublica, Consumer Reports or U.S. News & World Report, you try to produce your own comparative information, then grab a bayonet and head off to methodological trench warfare with critics. I sympathize: I worked with a group of experts to produce a report card on patient safety by Congressional district and am familiar with many of the challenges.

I’d like to suggest that for local journalists there’s an easier way. The information hospital execs and physicians want for themselves and what journalists want for the public is in many ways identical: accurate, comparable, timely and actionable data. With apologies to the Watergate hearings, your job is to ask local hospitals, “What do you know, and when do you know it?” and then press for as much of that information as possible to be publicly shared.

Here’s a quick example. The hospital closest to my home is part of a four-hospital system. I have no doubt the system’s executives can see performance information for each individual facility, but the system, like many, reports quality information to Medicare using a single provider identification number. If the individual hospital data is good enough for the system’s executives, it’s good enough to voluntarily share with those of us living in the communities the hospital serves. And if you’re a community benefit nonprofit hospital, why wouldn’t you want to?

Similarly, we know there’s often a connection between higher surgical volume and better outcomes. If the community has to wait for the government to process hospital-submitted data, it may be late 2016 before you see information from 2015. However, local hospitals could voluntarily make public the number of surgeries they perform every quarter — particularly for common procedures such as hip and knee replacement or cardiac surgery — on a rolling one- or two-year basis, as soon as the latest information is available. That voluntary sharing should extend to rural and urban “safety net” hospitals that may be exempt from some government reporting but still owe their communities the same moral obligation of transparency.

Journalists might start with a request (perhaps with a FOIA attached) to a state- or county-owned facility, and then use that information to encourage similar disclosure by others. You might also enlist employers and community groups (perhaps cancer survivors?) as allies. Not incidentally, using straightforward measures of care “processes,” such as the percentage of cardiac patients given aspirin, staves off arguments about whether bad outcomes were due to a particular hospital’s patients being in poorer health to start with.

Of course, it would be even better if hospitals voluntarily released information on the number of procedures individual surgeons perform. As I’ve noted, New York State data on individual cardiac surgeons emerges with a two- to three-year lag, making it of marginal usefulness. Still, if you’re a journalist in New York State, or other states where similar information is eventually available, you could push for hospitals to make that information publicly available far sooner. To repeat: If information is reliable enough for hospitals’ internal use, it’s reliable enough to share with the public while it’s still relevant.

At the very least, hospitals should be held accountable for the timeliness, clarity and comprehensiveness of information on their own websites. For example, Boston is home to some of the world’s most prominent health care quality and safety scholars, yet the information on the sites of the institutions where these individuals are credentialed is often sparse, confusing and outdated. In the Chicago area, one university hospital disclosed two wrong-site surgeries deep within the “safety and quality” portion of its website roughly two years after they occurred.

Prestigious hospitals should also be pressed to use clinical comparisons relevant to their true competitors (other elite hospitals), not merely the average performance of all hospitals in the state. For certain procedures, such as heart surgery, academic medical centers almost always have some of that comparative information. (For an example, see this Mayo Clinic page comparing mortality rates at its different sites to the average of a consortium of academic medical centers.)

Indeed, surveying local institutions to find out, “What do you know and when do you know it?” could itself produce interesting and important stories for journalists. We’ve learned that hospitals are often in the dark about their own costs and prices, but what do they really know, and what should they know but don’t, about the safety and quality of the clinical care that is at the heart of their mission?

One indicator of the intensity of hospitals’ attention to clinical performance could be the rate of central line-associated bloodstream infections (CLABSIs). These infections are dangerous, often fatal and highly expensive. Data reported by hospitals to the Centers for Disease Control and Prevention takes many months to appear in a public CMS report. Moreover, the formula used to characterize how well hospitals are preventing such infections is complex.
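
To give a sense of that complexity: the headline CLABSI figure CMS publishes is not a raw count but a standardized infection ratio, observed infections divided by a risk-adjusted predicted number drawn from CDC baseline data. Here is a minimal sketch of the arithmetic, with made-up numbers:

```python
# Illustrative sketch of a standardized infection ratio (SIR), the form in which
# CMS publicly reports CLABSI performance. The numbers below are hypothetical;
# in real reporting, the "predicted" count comes from CDC/NHSN risk models that
# account for factors such as central-line days and unit type.

observed_infections = 4      # CLABSIs the hospital actually recorded
predicted_infections = 6.8   # risk-adjusted expectation from national baseline data

sir = observed_infections / predicted_infections
print(f"SIR = {sir:.2f}")    # a value below 1.0 means fewer infections than predicted
```

Unpacking what went into that “predicted” number is exactly the kind of methodological trench warfare local journalists can sidestep by asking hospitals to share simpler, fresher numbers directly.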

Here’s what you need to know: A major effort among Michigan hospitals that began in 2008 showed that CLABSIs in intensive care units could often be reduced to near zero within 18 months. Hundreds of hospitals nationally are meeting that standard less than a decade later. It doesn’t take expensive technology, just a relentless, consistent commitment. Do your local hospitals have that, or are their priorities elsewhere?

As I’ve written before, Kentucky’s Norton Healthcare reports almost 600 nationally recognized quality indicators for the system and individual facilities. The effort began in 2005 and includes core principles such as, “We do not decide what to make public based on how it makes us look.” Why, exactly, can’t your local hospitals do the same?

Like Norton, other hospitals need to demonstrate that their default report card is not “straight A’s” or a discreet silence. There is, after all, a difference between “First do no harm to my reputation” and “First do no harm to patients.”

Michael L. Millenson is the president of Health Quality Advisors LLC in Highland Park, Ill. and a former health care reporter for The Chicago Tribune.