Is transparency always a good thing? EPA weighs controversial new rule.
If a revised rule proposed last week is finalized, the Environmental Protection Agency could soon change how it uses science.
“Transparency” lies at the heart of the controversial proposal. Initially suggested in 2018, the revised version of “Strengthening Transparency in Regulatory Science” would mean that, when drafting environmental and public health regulations, the EPA would give preference to research studies for which the underlying datasets and models are publicly available. In the previous draft, all aspects of a scientific study had to be publicly available for research to even be considered. A 30-day public comment period opened March 3, and EPA administrators aim to have the rule finalized by May.
Since its conception, the proposed rule has drawn sharp criticism. While supporters assert that it would be a safeguard to ensure trustworthy research, opponents see it as a Trump administration attack on science that co-opts the positive connotations of “transparency” for political aims.
Why We Wrote This
Concerns about public trust in scientific expertise abound. Could increased transparency around research promote confidence in science?
Sharing raw data makes sense “in the abstract,” says Wendy Wagner, a professor at the University of Texas School of Law who studies the use of science by environmental policymakers. But, she says, there are “a lot of steps between that kind of idyllic in the abstract and mandating it as a prerequisite to considering the scientific information.”
Indeed, transparency does hold scientific value. At the same time, with policy around hot-button issues from coronavirus to climate change being guided by scientific research, it’s vital that both policymakers and the public trust the findings. Transparency might play a role in earning that trust.
A matter of trust
Public trust in science is indeed higher than some headlines might lead you to believe. According to a Pew Research Center survey conducted in 2019, 86% of Americans surveyed said they had confidence in scientists to act in the public interest – a number greater than that for most other institutions, and on a par with the military.
Transparency does seem to play a role: in the same survey, 57% of Americans said that open public access to data and independent committee reviews of research would boost their confidence and sense of trust.
That’s not to say that they are actually interested in parsing through that data. Rather, says Cary Funk, director of science and society research at Pew Research Center, it’s likely an underlying assumption that “when you’re open and transparent, you don’t have anything to hide.”
As an abstract concept, transparency gets at one of the main tenets of science: open communication among researchers in a way that allows them essentially to check each other’s work. But the specifics – especially when sharing datasets with the public – get a bit thornier.
For one thing, not all raw data can be released to the public easily. Some data is protected as a trade secret. Other data comes from study participants, often in the form of medical records, and may contain too much personally identifiable information to be shared without violating privacy.
That is an especially challenging aspect for many of the public health studies underpinning landmark EPA regulations, such as air quality standards. Critics were quick to point this out during the initial comment period for the “transparency” rule in 2018, and the revision allows such studies to be included, although they would be given less weight than studies whose data is freely available.
Starting a “conversation”
Transparency also doesn’t just have to mean dumping all the data out there for anyone to parse through, says the University of Texas’ Professor Wagner. In some ways, handing nonscientists raw data may do little to help them interpret it.
“You do need to know what you’re doing with your data,” says Dominique Brossard, who co-directs the Science, Media and the Public research group at the University of Wisconsin-Madison. “You need to be trained. You need to be able to actually understand statistics.”
“Because at the end of the day,” she says, “you can make the data tell a lot of things, you know, in a way that it can be massaged to reach a certain conclusion.”
Furthermore, Professor Wagner says, making sense of some of this data, especially big data and models, requires computer science expertise, resources, and time.
“It becomes even more of a pay-to-play system as a result of that approach to transparency,” she says. “And, when you have all those advantages, you also can do a lot of mischief with datasets.”
So instead, both Professor Wagner and Professor Brossard suggest a different approach to transparency: a conversation. Rather than handing over data for people to explore on their own, they argue that more trust will come from explaining the research process to the public and stakeholders in a clear, honest way. Walking through how a study was conducted and peer-reviewed, explaining how independent reviewers were selected, and acknowledging the problems and uncertainties in the results may build more trust and confidence.
“In the study of science, one of the big concerns is trust in expertise. And I don’t think the way you get the trust is to throw downloadable models and datasets at people,” Professor Wagner says. “Trust is a process.”