Do penalties curb medical errors? Investigation offers lessons in data wrangling

Published on July 27, 2016

This project started with a phone call in late 2012.

We had just published a front-page story about a local hospital that had been fined $100,000 by the state for taking the wrong kidney out of an elderly man. It was the classic “wrong side” medical mistake. The left kidney had a cancerous growth but, through a series of mix-ups, a surgeon instead operated on the right side, leading to disastrous consequences for the patient and his wife.

The day after the story ran, his wife called and told an interesting tale of how her husband, who had struggled for several years with only one cancerous kidney, managed to survive by carefully watching what he ate. She said her husband had forgiven the surgeon, which turned out to be true.

But when I finally got him on the phone, he said he did not want to participate in a follow-up story, despite his wife’s loquacity.

So we killed the story. But upon the man’s death, his wife started calling regularly.

Every few weeks she would want to discuss the mistake that had turned her life upside down and what was being done to fix the system that ultimately caused his premature death.

The California Data Fellowship seemed like the perfect occasion to take a deeper look at the state’s immediate jeopardy penalty program, which the California Legislature created in 2006 and which took effect in 2007.

I thought it would be interesting to see whether the penalty program had reduced the number of immediate jeopardy events (those deemed to cause the death or serious injury of patients) since its inception.

Also, I had a real person to give the topic a human touch. My regular caller decided that she would participate and tell the story of what it is like to live with the consequences of a severe medical error.

Background research quickly showed that, while the state publicized and penalized the worst errors, there are many more “adverse events” that are dealt with administratively and do not get as much attention.

It seemed like a good idea, then, to see if hospitals that have received the high-publicity penalties for major mistakes subsequently report increases or decreases in their overall number of adverse events. In other words, does getting dinged in public cause hospitals to make significant changes that end up leading, years later, to lower rates of all harm to patients?

I seemed to be on track toward that goal. After a wait of several months, the California Department of Public Health (CDPH) sent over a spreadsheet with about 9,000 adverse events assessed against hospitals from 2007 through most of 2015.

I spent some time after the holidays taking a look at this data. Linking events to penalties was confusing, and I spent some time going back and forth with CDPH trying to understand how the various pieces of what they had sent me fit together.

Of course, data work is all about the mash-up: connecting one database to another by a common field so that your analysis can go deeper. My mentor, Paul Overberg, suggested from the start that we look at these hospitals in terms of data from CMS’ Hospital Compare or some other source that tracks quality. I also wanted to map these locations, so GPS coordinates were a bonus.
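For anyone who wants to try this kind of mash-up in code, here is a minimal sketch in Python with pandas. Every file name and column name is hypothetical; the real CDPH and Hospital Compare extracts have their own field names, and the shared key would need to be confirmed before joining.

```python
# Hypothetical sketch: join state adverse-event data to CMS quality and
# location data on an assumed common field (a Medicare provider number).
import pandas as pd

events = pd.read_csv("cdph_adverse_events.csv")    # state event/penalty records
compare = pd.read_csv("cms_hospital_compare.csv")  # CMS quality scores + coordinates

# Each event row picks up the hospital's quality score and GPS coordinates.
merged = events.merge(
    compare[["provider_id", "quality_score", "latitude", "longitude"]],
    on="provider_id",
    how="left",
)

# Rows that failed to match are the ones worth inspecting by hand.
unmatched = merged[merged["quality_score"].isna()]
print(f"{len(unmatched)} events did not match a Hospital Compare record")
```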

Linking things together proved more difficult than I thought it would. The state’s identification numbers didn’t seem to match other spreadsheets that showed location or Medicare provider numbers. But, eventually, it came together. I’ll admit there was some hand-matching of data involved.
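When ID systems refuse to line up, a fuzzy-matching pass can shrink the pile left for hand-matching. Below is a small sketch using Python’s standard difflib module; the facility names and the 0.8 cutoff are illustrative, not the matching actually done for this project.

```python
# Illustrative fuzzy matching of facility names across two sources.
import difflib

state_names = ["SCRIPPS MERCY HOSPITAL", "SHARP MEMORIAL HOSPITAL"]
cms_names = ["Scripps Mercy Hospital San Diego", "Sharp Memorial Hospital"]

for name in state_names:
    # get_close_matches returns the best near-matches above the cutoff;
    # anything it cannot resolve still goes to a human for review.
    matches = difflib.get_close_matches(name.title(), cms_names, n=1, cutoff=0.8)
    print(name, "->", matches[0] if matches else "NEEDS HAND-MATCHING")
```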

Finally, it looked like we were getting somewhere. I had thousands of spots on a Tableau Public map and zoomed in on San Diego County, my coverage area.

Uh oh.

Mousing over the facilities in my community, I could tell that penalties, big ones that had been publicly announced in official state press releases, were missing from this database. I pulled up the state’s press release index and checked a few known penalties in other counties.

More holes.

Naturally, I went back to CDPH to ask for clarification. I still haven’t gotten any: the agency has neither confirmed that the original spreadsheet is missing records nor provided a fresh, complete dataset.

I would get an occasional email saying an update was coming; then, when the promised date passed, silence. Eventually, the communications officer assigned to the case left the agency. Another was assigned, and my emails often went unanswered.

With the project deadline closing in, we decided to simply use year-by-year state reports to track the overall prevalence of immediate jeopardy errors and build a map of just the nearly 400 immediate jeopardy penalties that have been publicly announced since the program started.

Sure, it’s not as thorough as the deeper dive that would have been possible with the full dataset, but we could still see, based on the state’s own reporting, that the overall number of adverse events had not dipped below 1,000 since the new law took effect nine years ago.

We chased this angle and published two stories and an interactive Tableau Public map online on June 29. By the way, a shout-out here to Tableau Public’s database joining features. If you’re looking to quickly connect two spreadsheets by a common field, the tool is accurate and simple to use.

Sadly, the project did not feature my long-time correspondent who lost her husband to a botched kidney surgery. I called her when we were ready to do an interview and photos, and she backed out. She said she was afraid of retaliation.

Thinking it over, I’d say I have two main lessons to convey from the experience of reporting this project:

1. When you get a big dataset that you’ve been waiting months to receive, find a way to audit it before you invest your time in analysis. Connect the data back to the real world, if you can, to make sure it’s complete (a minimal sketch of one such check follows this list). Had I done this in January, I might have received an update in time to hit the project deadline.

2. Try to have a backup plan. Back in October 2015, I had discovered the state’s “fee” reports, which summarize the penalty system’s performance overall, and these reports became critical later in the process. Likewise, I had first requested a smaller dataset of just the publicly announced penalties, and I still had it in the wings when the larger trove proved spotty. It still needed hyperlinks to the actual government records, location data and basic descriptions of each incident, but having a smaller, decent-sized spreadsheet on hand allowed the project to move forward.
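Here is the sketch promised in lesson 1: a minimal audit in pandas, assuming you can assemble a small, independently verified reference list (say, penalties scraped from the state’s press release index) to check the big extract against. All file and field names are assumptions for illustration.

```python
# Hypothetical audit: flag known penalties that are absent from the extract.
import pandas as pd

full = pd.read_csv("cdph_full_extract.csv")         # the big, long-awaited file
known = pd.read_csv("press_release_penalties.csv")  # independently verified records

# An anti-join: keep reference records with no match in the big extract.
check = known.merge(
    full[["facility_id", "penalty_date"]],
    on=["facility_id", "penalty_date"],
    how="left",
    indicator=True,
)
missing = check[check["_merge"] == "left_only"]
print(f"{len(missing)} known penalties are missing from the dataset")
```

The indicator=True flag labels each merged row as "both" or "left_only," which makes filtering down to the unmatched reference records a one-liner.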

I’m still waiting to hear back from the state on whether The San Diego Union-Tribune will receive the full database. If and when we do, I’ll analyze it and publish a follow-up story.

Paul Sisson, a reporter for The San Diego Union-Tribune, reported on the effects of California’s unique administrative penalty program for severe medical errors, a project that was part of the Center for Health Journalism’s California Data Fellowship.

[Photo: Christopher Furlong/Getty Images]