
Parkland in Dallas has been among Texas’ worst hospitals for patient safety for years, analysis shows

By Ryan McNeill
Most other hospitals in N. Texas also lag on key measure, analysis shows
Dallas Morning News
Sunday, October 16, 2011

Parkland Memorial Hospital, now under U.S. government monitoring because of systemic failures in patient care, has for years been one of the state’s worst-performing hospitals on a broad federal measure of patient safety, a Dallas Morning News analysis shows.

Several other Dallas-area hospitals also ranked among the 10 worst large hospitals in Texas, including UT Southwestern University Hospital-St. Paul, which shares physicians with Parkland. Others were John Peter Smith Hospital in Fort Worth, Methodist Charlton Medical Center in southwest Dallas and Baylor Medical Center-Garland.

Only 10 of the 27 large hospitals in Dallas, Collin, Denton and Tarrant counties ranked above average on the patient safety measure, which was designed to track problems such as surgical accidents and hospital-acquired infections. And only one — Texas Health Harris Methodist Southwest Fort Worth — scored among the state’s 10 best large general hospitals, those with 200 beds or more. Dallas County’s top performer was Texas Health Presbyterian Hospital Dallas.

Accuracy accepted

Hospital representatives contacted by The News accepted the accuracy of the calculations. But they questioned how well the data reflected actual performance and current hospital conditions.

A similar analysis, released Thursday by federal health regulators, also found that Parkland and St. Paul scored poorly. The federal report studied Medicare patients only, while The News used data covering all types of patients.

The newspaper spent six months analyzing nearly 9 million state hospital discharge records using Patient Safety Indicators, or PSI, software. This highly sophisticated system was designed for the federal government as a tool to measure potentially preventable complications among hospital patients.

The PSIs do not present a complete safety picture because they are based on administrative data — a summary of diagnoses, procedures and outcomes derived from patients’ medical charts, as opposed to a complete review of all medical records.

Medical experts who developed the system said it cannot be the sole basis for determining hospital quality but is the best publicly available tool for examining patient safety.

In general, said Dr. Patrick Romano of the University of California at Davis, “hospitals with higher rates are placing patients at higher risks overall.”

Romano is the lead physician on the PSI project, and he advised The News throughout its analysis. The newspaper focused on larger, full-service hospitals and calculated PSI rates for three years to guard against data fluctuations.

Parkland ranked among the worst five large hospitals in Texas in 2007, 2008 and 2009, the latest year for which complete data was publicly available. The Dallas County public hospital has been the subject of several government safety inspections that found patients had been harmed or put in “immediate jeopardy.” In late September, it became the largest hospital, and only the fifth in the nation, forced to accept federal patient safety monitoring.

“There is no excuse for us to be at the bottom of anything,” said Dr. Lauren McDonald, chairwoman of Parkland’s governing board. “It’s so easy to say our patients are sicker, it’s so easy to say we have so many more patients, but that no longer is going to be an adequate excuse.”

UT Southwestern’s chief quality officer, Dr. Gary Reed, said the PSIs “give you a way to look in general terms at issues, so long as you don’t over-interpret them.” Both Reed and Parkland said their hospitals’ performances had improved on a number of measures since 2009, although they released only limited data to support that.

Hospitals use PSIs internally to gauge patient safety, identify problem areas and benchmark against their peers. But they don’t generally publicize their findings.

Patient safety advocates say the public release of PSI data is key to informed health care choices.

“These measures illustrate the harm that patients in Dallas hospitals are experiencing,” said Lisa McGiffert, national director of Consumers Union’s Safe Patient Project. “Since they identify preventable problems, scores in the worst 10 to 20 percent indicate where these hospitals urgently need to step up their efforts.”

Preventable problems

PSIs “reflect quality of care inside hospitals,” according to the Agency for Healthcare Research and Quality, a division of the U.S. Department of Health and Human Services. It released the PSI software in 2003 and periodically updates it, most recently in August. The News used that version for its final analysis.

The software analyzes the administrative data that nearly every hospital in Texas reports to the state. No patient-identifying information is included.

The results on 15 PSIs are statistically “risk-adjusted” because some hospitals treat a disproportionate share of unhealthy patients, who face a greater risk of potentially preventable complications. Rates from eight of the indicators are used to determine a hospital’s patient safety “composite score.”
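The mechanics described above can be sketched in a few lines of code. This is a simplified illustration only: the actual AHRQ PSI software performs far more elaborate risk adjustment and reliability weighting, and the ratios and weights below are hypothetical, not values from the analysis.

```python
# Simplified sketch of risk adjustment and a patient safety composite.
# The real AHRQ PSI methodology is far more complex; all numbers here
# are hypothetical and for illustration only.

def risk_adjusted_rate(observed, expected, reference_rate):
    """Indirect standardization: scale a reference rate by the
    hospital's ratio of observed to expected complications."""
    return (observed / expected) * reference_rate

def composite_score(ratios, weights):
    """Weighted average of per-indicator observed/expected ratios.
    A value above 1.0 means more potential complications than
    expected for this hospital's patient mix."""
    return sum(r * w for r, w in zip(ratios, weights)) / sum(weights)

# Hypothetical hospital: three of the component indicators shown,
# each expressed as an observed/expected ratio, with made-up weights.
ratios = [1.8, 1.2, 0.9]
weights = [0.5, 0.3, 0.2]

score = composite_score(ratios, weights)  # 1.44
```

Here the hypothetical hospital's composite of 1.44 would mean roughly 44 percent more potential complications than expected given its patient mix, which is the kind of signal the composite score is meant to surface.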

That score is a broad look at patient safety “to the extent that’s feasible with administrative data,” said Kathryn McDonald, executive director of Stanford University’s Center for Health Policy/Center for Primary Care and Outcomes Research. She is the principal investigator for the government’s PSI software project and also advised The News on its analysis.

“There are [medical] complications that are not looked at with the PSIs because you can’t get them with administrative data,” McDonald said. But when a hospital performs significantly better or worse than others, “that’s a signal that there’s a systemic difference.”

“What we don’t know is whether they’re systemically different in terms of the way they document their care and code it up, the way that they’re delivering quality care or a patient mix that is not fully represented in the risk adjustment,” she said.

UTSW’s Reed agreed that the scores have value.

“I do believe in composite scores,” he said. But “they shouldn’t be looked at as a determinative measure of quality care.”

The Dallas-Fort Worth Hospital Council, a trade association, shares PSI data among its 75 member hospitals. But their results should remain private, council president W. Stephen Love said.

“To publish the names of hospitals in the media using only administrative [data] we feel is not fair and provides an incomplete picture,” he said.

Romano, the UC-Davis physician who helps develop the PSIs, said the imperfections do not merit keeping the information from the public.

“At the end of the day, we use the best information that we have and we try to use that information to inform stakeholders,” Romano said. “That’s what this is all about.”

Shared problems

The News’ findings raise particular questions about Parkland, which serves as a regional trauma center, and UTSW’s St. Paul. Both are taxpayer-funded institutions subject to Texas’ open-records law, so they generally must release more information than private hospitals. But they have not released data supporting their long-standing claims of excellent care.

Parkland and St. Paul are teaching hospitals, staffed largely by UTSW faculty and doctors in residency training. Both ranked poorly on many of the same patient safety indicators, such as accidental punctures and lacerations during procedures.

Their virtually identical composite scores were also among the state’s very worst, 64 percent higher than the combined composite for patients at all 105 large hospitals in Texas.

In 2009, Parkland was the worst-ranked large hospital in Texas for accidental punctures and lacerations during procedures. The analytical software found that Parkland had 172 potential cases that year, a risk-adjusted rate of 8.4 per 1,000 patients.

After The News shared its analysis, Parkland officials said the hospital’s performance had improved since 2009 on two patient safety indicators — bedsores and lung punctures — as well as on the composite index. The hospital provided a few charts showing those improvements but did not release figures for other indicators or underlying data.

Parkland later posted on its website a report about its performance on the composite and 14 of the PSIs. From July 2010 to June 2011, Parkland said it was “within target range” or better on the overall score and 10 individual indicators. It performed worse on the other four. The report did not explain the term “target range”; hospital officials later told The News it referred to the rate expected for an average U.S. hospital.

“We’re improving,” said Dr. Angelique Ramirez, Parkland’s patient safety officer. “We still have work to do.”

The News’ analysis showed that the composite score for all large Texas hospitals improved from 2007 to 2009 by 41 percent. For the same period, Parkland’s composite score improved 33 percent, indicating that statistically it did not keep pace with its peers. Parkland did improve its relative ranking on the composite score from worst among the state’s large hospitals in 2007 to fourth worst in 2009.
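The comparison in the paragraph above is a simple percentage-change calculation. The composite values below are hypothetical placeholders; only the percentages come from the analysis.

```python
# Sketch of the percent-improvement comparison described above.
# Composite values are hypothetical; lower scores are better.

def percent_improvement(old, new):
    """Percentage decrease in a composite score over a period."""
    return (old - new) / old * 100

# Hypothetical composites chosen to reproduce the reported percentages.
state_2007, state_2009 = 1.00, 0.59        # ~41 percent improvement
parkland_2007, parkland_2009 = 2.10, 1.41  # ~33 percent improvement

state_gain = percent_improvement(state_2007, state_2009)        # 41.0
parkland_gain = percent_improvement(parkland_2007, parkland_2009)  # ~32.9
```

The point of the comparison: even though Parkland's score fell in absolute terms, it improved more slowly than the statewide pool, so its relative standing barely moved.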

St. Paul also scored poorly on accidental punctures and lacerations. The analysis identified 103 potential cases, a risk-adjusted rate of 6.8 per 1,000 patients — third-worst for a large hospital in Texas in 2009.

“What you’re going to hear from me is, like any other health system, we have issues that we have to work on,” said Reed, the chief quality officer for UTSW hospitals.

Reed said that while the overlap of surgical staffs at Parkland and St. Paul could contribute to both hospitals’ poor performance on several of the patient safety indicators, another explanation might be how those hospitals code their care.

Reed also said other quality measures, especially risk-adjusted mortality rates, are necessary for a full understanding of a hospital’s care. Such rates count the number of patients with certain conditions who die during or shortly after hospitalization but don’t focus on medical error.

Dr. John Jay Shannon, Parkland’s chief medical officer, said he believes mortality rates are even more important than PSIs. “We do quite well there,” he said. “I’d like to go to a place where I’m less likely to die.”

Two of the PSIs measure mortality. In 2009, Parkland had the 10th-worst risk-adjusted rate on deaths among surgical inpatients with serious treatable complications. Its performance was average on deaths among patients considered unlikely to die.

The News used the PSIs to benchmark hospitals on the issue of preventable harm after uncovering several cases in which Parkland patients suffered death or disfigurement as a result of their care.

Internally, Parkland’s managers have embraced the use of PSIs. The indicators “utilize readily available administrative claims data such as coding information to better understand patient safety performance at hospitals,” Ramirez wrote in a 2010 report to the hospital board.

One of Parkland’s “top priorities for improvement” was in the PSI for accidental punctures and lacerations, the report said, which offered a “mixed opportunity” for improving both clinical care and record-keeping.

“Everybody who works in clinical medicine is wary about the limitations that are in administrative data,” Shannon said. “It’s just another part of why we use these things as more signal or prompt to do further investigation.”

Baylor Health Care System’s vice president of patient safety said he thought the PSIs were a “relatively weak” way to compare patient safety at different hospitals. “Nevertheless, this method has some scientific validity and is more accessible than would be a clinically based method,” said Dr. Donald Kennerly.

PSIs are “a recognized way to look at things” despite their shortcomings, said Dr. Mark Lester, interim chief quality officer for Texas Health Resources, which operates six large hospitals in the Dallas area.

“You’ve gone to the state public database. You’ve looked at a recognized indicator developed by the federal government,” said Lester, a neurosurgeon, after reviewing The News’ findings. “If you’re trying to get a look in the only way you can, I can’t fault what you’re doing.”

Transparency, safety

For the same reasons The News chose to analyze Texas hospitals’ performance on the patient safety indicators, the public will soon see a lot more of them.

For the first time ever, the federal government has started reporting performance on some PSIs on its Hospital Compare website. The analysis released Thursday covers Medicare patients treated from October 2008 through June 2010.

The federal report found that Parkland and St. Paul were among seven Texas hospitals that trailed the national average on composite scores. In accidental punctures and lacerations, St. Paul and Parkland had the 16th- and 18th-worst rates, respectively, of more than 2,700 hospitals nationally. Two hospitals that performed poorly in the newspaper’s analysis had average composite scores in the government study: Methodist Charlton and Baylor-Garland.

In 2013, the Texas Department of State Health Services, which provided the data studied by The News, plans to begin releasing its own analysis of hospital performance based on the PSIs.

The agency said it had not yet publicized hospital PSI scores because it wanted to wait for data reflecting whether certain conditions were present when a patient was hospitalized. Hospitals in Texas started reporting these data to the state this year.

Some PSI rates, for conditions such as bedsores and postoperative blood clots, can change significantly when those data are available. Hospitals’ relative scores change if they admit unusual numbers of patients who already have such problems.

One such case may be Methodist Charlton, which a spokeswoman said admits a large number of patients with bedsores from nearby nursing homes.

On advice from researchers, The News tested its conclusions by recalculating composite scores without using bedsore and blood-clot results. While St. Paul, Parkland and John Peter Smith hospitals remained among the state’s very worst, Methodist Charlton and Baylor-Garland improved their rankings slightly.
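The robustness check described above amounts to dropping two indicators and recomputing the average. A minimal sketch, with entirely hypothetical indicator names and ratios:

```python
# Sketch of the sensitivity test described above: recompute a composite
# after excluding the two indicators most affected by whether a condition
# was present on admission. All names and numbers are hypothetical.

def composite(ratios_by_indicator, exclude=()):
    """Unweighted average of observed/expected ratios, optionally
    excluding named indicators."""
    kept = [r for name, r in ratios_by_indicator.items() if name not in exclude]
    return sum(kept) / len(kept)

hospital = {
    "accidental_puncture": 1.7,
    "postoperative_sepsis": 1.4,
    "bedsores": 2.5,             # may reflect patients admitted with bedsores
    "postoperative_clots": 2.0,  # may reflect clots present on admission
}

full = composite(hospital)                                        # 1.9
adjusted = composite(hospital, exclude={"bedsores", "postoperative_clots"})  # 1.55
```

If a hospital's poor score is driven mainly by conditions patients arrive with, the adjusted composite drops sharply; if it stays high, as it did for Parkland and St. Paul in the newspaper's recalculation, the present-on-admission issue cannot explain the ranking.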

Consumer advocates say they hope the use of PSI data will be the beginning of greater transparency in health care. Medical researchers say there may be a link between transparency and safety.

“My general experience has been that hospitals that fail to be transparent are usually not committed to improving,” said Dr. Ashish Jha, a practicing physician and associate professor at Harvard University who has long studied hospital performance and patient safety.

Transparency can also prompt improvement, as shown by successful efforts in several states to fight hospital-acquired infections.

“In my experience, public reporting gets the attention of hospitals whose rates are high,” said Dr. Peter Pronovost, a leading national researcher on patient safety at Johns Hopkins University School of Medicine. “Without reporting, many hospitals do not give these infections the attention they deserve.”

Pronovost leads a national project on reducing infections in which both UTSW and Parkland participate.

Paul Levy, former CEO of Harvard’s Beth Israel Deaconess Medical Center, was one of the first hospital administrators to publish patient outcome data online.

“Doctors and others pledge to do no harm. How can they be sure they are living by that oath if they are unwilling to acknowledge how well they are actually doing the job?” Levy said. “The failure by a hospital to publish sufficient data to allow the community to hold it accountable indicates a fundamental distrust in the public and disdain for its right to know.”