Last week when I wrote about the contretemps between the Pew Forum and Catholic Relief Services and the U.S.C.C.B. over the precise dollar figures for the advocacy work undertaken by the two Catholic entities, I relied on previous Pew statements in my possession to cobble together the forum's response. This week the forum's Erin O'Connell reminded me that I could have gone right to the source for a more detailed response (mea culpa), and she provided one via e-mail. (Allen Hertzke, the study's primary researcher, for his part responded to a withering blog post from Sister Mary Ann Walsh at the Huffington Post.)
I said last week that Pew was standing by its figures, but O'Connell tells me, well, perhaps not: "We are currently working with both organizations in an effort to arrive at better figures." That effort will mean separating out the costs of U.S.C.C.B./C.R.S. activities that do not fall under even the broad understanding of advocacy that Pew used in its study. Its researchers will be consulting with C.R.S. and U.S.C.C.B. staff to tease out more accurate numbers.
I'll go out on a limb and predict that the figures Pew cited for U.S.C.C.B. and C.R.S. advocacy/lobbying—$27 million and $4.7 million, respectively—will come down, perhaps dramatically, once this more detailed analysis is complete.
O’Connell points out that part of the reason the Pew figures are so much higher than officials from C.R.S. and the U.S. bishops think they should be is that Pew indeed used a generous understanding of advocacy meant to "encompass a wide range of efforts to shape public policy on religion-related issues. It includes lobbying as strictly defined by the Internal Revenue Service," that is, "attempts to influence, or urge the public to influence, specific [state or federal] legislation…. But it also includes other efforts to affect public policy, such as activities aimed at the White House and federal agencies, litigation designed to advance policy goals, and education or mobilization of religious constituencies on particular issues.”
The bigger problem for Pew researchers was the financial statements they used to come up with their projected totals. They relied only on public documents, not interviews with staff, to produce the numbers. Alan Cooperman, associate director for research at the Pew Research Center's Forum on Religion & Public Life, strenuously defended that methodology, arguing that public and verifiable data were the correct resource to use. Self-reporting was a non-starter, he said: some religious lobbyists and advocates might be tempted to inflate their budgets to overstate their K Street cred, while other groups might be tempted to use creative accounting to understate their spending and stay under the cultural radar. Using publicly available data seemed the best way to proceed.
Since many of the agencies included in the report were clearly engaged only in lobbying and advocacy, their budget numbers were not in dispute. But groups like C.R.S. and the U.S. bishops, whose budgets include social service, relief and development efforts, were trickier to parse, Cooperman allowed.
He said the U.S.C.C.B. and C.R.S. have been the only agencies to complain about the numbers Pew cited. He added that he had not anticipated finding himself at odds with both groups, particularly since the report generously included caveats and qualifications regarding the dollar figures it was using. Still, those warnings may not be enough to keep some from seizing on the report to depict the bishops as well-financed bullies on the Washington block, and C.R.S. certainly is not thrilled about leaving its donors with the impression that it maintains a profligate “lobby” budget when its mission is dedicated to disaster relief and development. Perhaps those concerns account for the Catholic agencies’ eagerness to set the record straight on the Pew citations.
Cooperman seems similarly eager to put this minor controversy in the forum’s rear-view mirror, and all parties are now hashing out numbers that will be more true to actual advocacy spending. He said the forum was interested in posting the most accurate and verifiable information possible, and to the extent that further discussion with C.R.S. and the bishops will achieve that goal, Pew researchers were happy to oblige. “We get questions from people and occasionally we get complaints, but we really try to be accurate and try to be transparent as possible,” he said.
In my conversation with him, Cooperman seemed a little frustrated by the inability of today’s headline-hungry journalists and bloggers to read Pew reports more carefully, thus missing a parade of qualifications that he thinks should have tempered reporting on that irresistible $27 million figure. Fair enough. I’m wondering if it’s OK to return the favor as a journalist who on occasion does indeed scan a text from a venerable research organ, presuming that it got the numbers right on the first go-around and was not publishing a best estimate.
It’s true that the message is discoverable in the fine print, but in these hasty e-days of blog-driven reporting, that fine print can get lost in the bitstream. Beyond that, considering the many cultural antagonists out there eager to find a way to take the U.S. bishops down a peg or two (more), couldn’t someone at Pew have anticipated how the juicy “lobbying” number was going to play in press coverage?
The desirability of relying on publicly available and verifiable information is not in dispute. And not relying on self-reporting from the agencies themselves certainly makes sense. That said, it’s not the responsibility of the U.S.C.C.B. and C.R.S. to craft financial statements that are easiest for social scientists to parse.
As Cooperman emphasized, the report notes that there was much that was unknown and ambiguous about the publicly available data released by the U.S.C.C.B. But there was what I can only decently characterize as an ineffectual attempt by Pew researchers to reach out to these agencies for greater clarity. Clearly even the authors of the study had some doubt about how accurate the high-end figure that was eventually published was going to be, hence all the qualifications. Perhaps that doubt should have been a warning to Pew researchers to review their procedures rather than an inspiration for crafting better data caveats.
It’s great that Pew researchers are willing to revise the report in consultation with C.R.S. and U.S.C.C.B. staff to arrive at more accurate figures. It would have been best, of course, to have calculated those numbers more definitively in the first place.