Q&A with Mark Katches: Editor breaks reporting down to the chemical level
Mark Katches is the deputy managing editor for projects at the Milwaukee Journal Sentinel. He leads a team of reporters who have been watchdogging the use of chemicals in food containers and other products for the past two years. The series, Chemical Fallout, was a finalist for the Pulitzer Prize and includes an explosive story this month about the FDA working closely with chemical industry lobbyists on bisphenol A standards. The heart of the series has been the Journal Sentinel's own review of chemical safety data and its testing of products. Katches has a history with testing: he headed a team at the Orange County Register that was a finalist for a Pulitzer Prize for a series built on testing candy for lead.
I reached him at his office in Milwaukee. Here is a recap of our conversation, edited for space and clarity.
Q: You spent most of your reporting career doing old-fashioned gumshoe work: knocking on doors, requesting public documents, persuading politicians' staff members to give you the inside scoop. When did you first work on a story where you thought there could be a more scientific approach?
A: I think it was when you and I were back at Harvard for the Goldsmith Awards, when we were finalists for the Body Brokers series. We saw what Andy Schneider had done at the Seattle Post-Intelligencer, where his team had tested soil for asbestos. Soon after that, I moved into editing, and I thought: this is a way you can really add some oomph to stories.
Q: And when did you actually start using it in stories?
A: The first time was on a story about dangers in child care centers. In one case, a child care center based at a church had been cited by the state for lead paint; inspectors reported a lot of chipped paint. One of the reporters, Jenifer McKim, picked up some paint chips, and we sent those to a lab and found that the paint did indeed contain lead. That was one isolated test at one child care center, and it made me think we could bring some authority to any subject by doing our own testing.
Q: With Toxic Treats, how did you persuade the editors at the Register that paying for testing of candy was a good investment, given that state and federal tests already had shown lead in candy?
A: We conducted more than 400 tests for that story. And to the editors' credit, it wasn't a hard sell. I think the idea was to try to go beyond testing of candy, which the state had done. We wanted to do a broader look at all the candy ingredients. That was one of the key selling points for doing our own testing. We were testing chili from the fields before and after it had been dried. We tested dirt from those fields. We tested wrappers and the clay pots that held some of the candy.
Q: How much did it all cost?
A: We spent a little less than $10,000. We were able to negotiate a pretty good deal with the labs.
Q: When you moved to the Journal Sentinel, did you immediately start talking about testing?
A: Not immediately, but we wanted to do the microwave testing back in 2007. It was an extraordinary ordeal. We went to a lab that said it could do the work, but it could not meet the standards the FDA had set; the lab had already started the testing and had to stop. It took us months to find a place that could do it and then nail down the protocols. It was not until the late summer of 2008 that the testing started in earnest. We ended up spending a little under $5,000 for that story. We also have launched a consumer watchdog team here that has done some interesting testing. They went into bars, got samples of beer from the taps and had the beer tested for bacteria; at almost every bar we went into, the taps were filthy. We tested maybe eight or 10 different beers at different places in the area. We tested sushi for mercury before the New York Times did its big story on sushi and mercury. We tested some toys for lead. We have done a lot of little targeted testing that has not cost much.
Q: How did "Chemical Fallout" start?
A: Susanne Rust is our science writer and has been writing about the dangers of BPA for some time. She expressed interest in doing a larger project when I got here, and the managing editor, George Stanley, was a big advocate for doing more on BPA. When I came on board, I built an investigative team with diverse backgrounds. I teamed up two of those reporters with Susanne to pursue the story, and they went after it pretty aggressively for several months.
Q: Your team did something that few mainstream publications have done. You reviewed more than 250 studies and analyzed their results. How did you ensure that you were using the necessary scientific rigor to accurately evaluate that research?
A: Susanne had the skills. She is an inch away from having a PhD herself. We looked at all the studies on bisphenol A over a 20-year period on vertebrate animals, usually mice, and nearly all of them had been peer-reviewed, published studies. Some of the studies the government had used to show that BPA was safe had never been peer reviewed. Essentially, it was not rocket science. It was a matter of seeing who was funding the studies, seeing what they studied, seeing what levels of BPA were detected and going from there. The big finding was that the vast majority of scientific research has found that BPA is dangerous. The minority of studies concluding it was safe were almost exclusively funded by the chemical industry.
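To make the method concrete: the review Katches describes is essentially a cross-tabulation of who funded each study against what it concluded. Here is a minimal sketch of that kind of tally in Python, assuming a hypothetical studies.csv with "funder" and "finding" columns; the Journal Sentinel's actual data and tools are not public.

```python
# Minimal sketch of the tally described above. The file name and
# column names are hypothetical stand-ins, not the paper's own data.
import csv
from collections import Counter

counts = Counter()
with open("studies.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Group industry-funded studies separately from the rest.
        funder = "industry" if row["funder"] == "chemical industry" else "other"
        counts[(funder, row["finding"])] += 1

# Print one line per (funder, finding) pair, e.g. "industry  safe  11 studies".
for (funder, finding), n in sorted(counts.items()):
    print(f"{funder:8s}  {finding:10s}  {n:3d} studies")
```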
Q: Let's talk a little about sample size. For the microwave-safe containers story, you tested 10 products in a lab. All were shown to be leaching chemicals when heated in a microwave. That sounds pretty damning. But how did you decide that testing 10 was enough? One could argue that you could test another 90 and see no leaching, which would make the results less compelling.
A: That's a great question. There is no easy answer. We tested each type of product three times. It was a total of 30 samples. In an ideal world with unlimited resources, I'd say, let's test 100 products. The fact is we didn't have the budget to do that. We did talk to a lot of other scientists before we did this, and we asked about sample size and other parts of the protocol until we felt good about the way we were doing things. You have to make sure you have vetted your protocols with people who know what they are doing.
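One hedged way to frame the sample-size question in this exchange (our framing, not the Journal Sentinel's): if all 10 independently chosen product types leached, an exact binomial calculation still puts a meaningful floor under the share of such products that would leach. A minimal sketch:

```python
# Minimal sketch: with 10 positives in 10 independent tests, the exact
# (Clopper-Pearson) one-sided 95% lower bound on the true proportion p
# solves p**n = alpha, i.e. p = alpha**(1/n).
n, alpha = 10, 0.05
lower_bound = alpha ** (1 / n)
print(f"95% lower bound on leaching share: {lower_bound:.3f}")  # ~0.741
```

In other words, under these assumptions, even a sample of 10 with no negatives is hard to square with leaching being rare.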
Q: Now, you and I also wasted a little bit of time and money with testing, right? You had the idea that stores were lying about the fat content of hamburger. So I drove around Orange County and bought a bunch of hamburger and found that the fat content numbers were pretty much on the money. That endeavor confirmed my belief that the best story ideas often do not come from editors. But what did you learn from it?
A: I think what I learned is that sometimes you are right and sometimes you are wrong. It wasn't terribly expensive testing, and it wasn't wide-scale. If we had gotten any hits, we would have tested more, but we didn't, so we dropped the subject. Sometimes when you test things you have to go on a hunch, and sometimes the results aren't going to be there. The key lesson is to always start with a small group and then branch out.
Q: What can someone at a small news operation do and not break the piggy bank?
A: You might want to hook up with a local university or even some local laboratories that might be willing to defray some of the costs in the spirit of public service. From a consumer watchdog standpoint, if you're trying to do quick stories, you don't have to test a million samples. You can do things that are targeted and inexpensive. If you want to do something more systematic, you probably do have to test more. The Chicago Tribune did some phenomenal work testing toys with a handheld testing device. Whatever you do, be transparent with readers. Let them know exactly what you did and why.