When Donald Trump takes over the federal government on January 20, his administration will also gain complete control over much of the .gov suite of websites, which currently hosts a treasure trove of publicly available, taxpayer-funded scientific research. The academic world is bracing itself: Will that data remain available after his transition?
Scientists and university professors all around the country and in Canada believe we're about to see widespread whitewashing and redaction of already published, publicly available taxpayer-funded scientific research, databases, and interactive tools, such as the National Oceanic and Atmospheric Administration's Sea Level Rise viewer, NASA's suite of climate change apps, and the Environmental Protection Agency's maps of the country's worst polluters. They also expect to see censorship, misrepresentation, and minimization of new government-funded research, specifically regarding climate change.
These fears are not based merely on dread of the worst from a man who has called climate change a Chinese hoax, nominated a climate change denier with close ties to the fossil fuel industry to head the EPA, picked the CEO of ExxonMobil as Secretary of State, and will reportedly name the fossil fuel-friendly Rick Perry as Secretary of Energy. During the George W. Bush administration, which similarly denied that climate change is caused by humans, there was widespread censorship and destruction of public-facing climate change information and research.
"Policies and practices have increasingly restricted the flow of scientific information emerging from publicly-funded climate change research," a 138-page report published in March 2007 by the Government Accountability Project begins. "This has affected the media's ability to report on the science, public officials' capacity to respond with appropriate policies, and the public's grasp of an environmental issue with profound consequences for our future."
The investigation found that the Bush administration systematically changed scientists' press releases, misrepresented scientific findings to Congress, and neglected or deleted information on government websites.
For instance, the State Department "retired" climate change from its "global issues" section, the National Oceanic and Atmospheric Administration delayed the publication of climate change data because of "White House concern about the subject's political sensitivity," and the Environmental Protection Agency's pages on global warming and global climate change research stopped being updated shortly after Bush took office. A high-ranking scientist at NOAA told the report's author that "anything dealing with climate change had to be pre-approved at the White House level," including the content on his laboratory's website. Bush also shut down the EPA's research libraries, an action that triggered an investigation from the Government Accountability Office.
It wasn't just Bush: anti-environment politicians such as Stephen Harper in Canada and Wisconsin Gov. Scott Walker have also muzzled and censored scientists and made government data harder to access online. Journalist James Rowen has been keeping an exhaustive blog about Walker's war on the environment.
With the Trump presidency looming, many scientists who studied Bush's policies have begun a mad dash to preserve climate science made publicly available under President Obama, fearing it may not remain accessible. Several professors I spoke to say that officials in the government's science agencies are privately imploring researchers outside the government to download what they can now, or risk losing access to it later. NOAA and the EPA did not respond to a request for comment. A spokesperson for NASA told me the agency is "apolitical" and that it is "committed to doing whatever we can to assist in making the Executive Branch transition a smooth transition."
If you work for a government agency and have thoughts on the matter or have had discussions about preserving scientific data under Trump, email me: email@example.com, or contact me securely on Signal: 301-412-7324 or SecureDrop.
Scientists who don't have to worry about upsetting their future bosses, however, tell a very different story.
"My expectation and fear is we are going to see round two of Bush," Robert Paterson, co-director of the Urban Information Lab at the University of Texas's School of Architecture, told me. Paterson posted about his concerns on a Facebook group for professors called Planners 2040 earlier this month. "The appointments are hostile to climate change, so I think it's prudent for folks to download the science that's easily available now, because you may have to file a [Freedom of Information request] later to get it."
While it's easy to scrape an HTML website, Paterson and others are worried that, for instance, a NOAA database and tool regularly used by city planners to calculate sea level rise could be pulled offline.
"It's less the documents, which we can get through alternative means," he said. "The bigger issue in my mind is the access to databases and analytic software that public dollars paid for which by administrative fiat they may remove. I use the NOAA sea level rise projection database for discussion in my environmental impact assessment class. I use the greenhouse gas emission calculator for analysis of major federal climate actions."
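The kind of do-it-yourself preservation Paterson recommends can be as simple as a short script that fetches a list of public pages and saves timestamped local copies. A minimal sketch (the example URL and directory name are illustrative, not from any researcher's actual workflow):

```python
# Minimal page-archiving sketch: save dated local copies of public pages.
# The URL passed to archive() below is only an example; substitute the
# pages and documents you actually rely on.
import datetime
import pathlib
import urllib.parse
import urllib.request

def local_name(url):
    """Turn a URL into a filesystem-safe name, e.g.
    'https://coast.noaa.gov/slr/' -> 'coast.noaa.gov_slr'."""
    parsed = urllib.parse.urlparse(url)
    path = parsed.path.strip("/").replace("/", "_")
    return parsed.netloc + ("_" + path if path else "")

def archive(urls, out_dir="gov_snapshot"):
    """Download each URL into a directory named for today's date."""
    target = pathlib.Path(out_dir) / datetime.date.today().isoformat()
    target.mkdir(parents=True, exist_ok=True)
    for url in urls:
        with urllib.request.urlopen(url) as resp:
            (target / (local_name(url) + ".html")).write_bytes(resp.read())

# Usage (performs a network request):
# archive(["https://www.epa.gov/climatechange"])
```

A script like this only captures static pages, which is exactly Paterson's point: interactive databases and calculators served from a backend can't be preserved this way.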
One of the main concerns is that the Trump administration wouldn't even have to purposely take down these tools: many of them will simply break or become useless if they are not regularly updated.
"While we may not see the straightforward deleting of data, we expect to see access to data starved out," Michelle Murphy and Patrick Keilty, who are spearheading a "Guerrilla Archiving" event at the University of Toronto, told me in an email. "It takes effort and money to keep databases and portals updated and maintained, and to make them publicly available. Moreover, data can move from being publicly shared through portals that make it immediately accessible to less accessible, but still technically public forms of availability."
In addition to the guerrilla archiving initiative, researchers at Carnegie Mellon, the University of California, Riverside, and the University of Pennsylvania are working on pre-Trump hackathons and other data retention plans. Professors at each of those universities expressed similar concerns, and said they would be working together to make sure that their projects didn't overlap. At Carnegie Mellon, for instance, associate professor Chris Labash is heading up a series of projects that will determine just how much data is lost under Trump.
"It's a bundle of research projects including looking at the tone, tenor, and trustworthiness of .gov communications, the topology of information, the commitment to information sharing with the public, what happens to information when an agency, department, or initiative is eliminated, and many other projects," Labash told me of the work.
The most important and longest-running effort to compile a complete picture of the government's websites as they stand today, however, is a joint project between the Internet Archive, the Library of Congress, and several universities around the country. Called the "End of Term Web Archive," the project compiled full snapshots of government websites at the end of the 2008 and 2012 terms. It likely won't be able to capture many of the databases that professors are worried about losing, but it will be the most complete archive of government websites as they stand today.
"We basically crawl as much as we can—.gov, .mil, government websites that are not .gov, and social media accounts," Jefferson Bailey, director of web archiving at the Internet Archive, told me. "We collect a lot of government web content in our regular course of activities, but we put a lot more resources into acquiring these resources before and after transition. It allows researchers to perform a longitudinal analysis of what disappears and how much content is on any given site."
The team recently started crawling as many sites as possible (hundreds of millions of unique URLs) and will continue crawling for two months after the inauguration. In 2008, it collected roughly 16 terabytes of information; in 2012, about 21 terabytes. This year, it expects the entirety of the government's websites to come to between 30 and 40 terabytes.
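The domain-restricted crawling Bailey describes, following links outward but staying within government sites, can be illustrated with a toy sketch. The real project uses industrial-scale crawlers; this only shows the follow-links-within-.gov-and-.mil idea, and the sample page is invented:

```python
# Toy sketch of domain-restricted link harvesting: given a page's HTML,
# keep only links that resolve to .gov or .mil hosts. A real crawler
# would then fetch those links and repeat.
import html.parser
import urllib.parse

class LinkParser(html.parser.HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def gov_links(page_url, html_text):
    """Resolve every link on a page, keeping only government hosts."""
    parser = LinkParser()
    parser.feed(html_text)
    keep = []
    for href in parser.links:
        absolute = urllib.parse.urljoin(page_url, href)
        host = urllib.parse.urlparse(absolute).netloc
        if host.endswith(".gov") or host.endswith(".mil"):
            keep.append(absolute)
    return keep

# Example: from a (made-up) NOAA page, the commercial link is dropped.
page = ('<a href="/data">data</a>'
        '<a href="https://example.com/x">outside</a>'
        '<a href="https://www.nasa.gov/">nasa</a>')
# gov_links("https://www.noaa.gov/", page)
# -> ["https://www.noaa.gov/data", "https://www.nasa.gov/"]
```

The hard part the sketch skips is also why the archive misses databases: content that only appears in response to a form submission or query never shows up as a crawlable link.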
"We're doing this as part of our own institutional goals," Abigail Grotke, who is heading up the project for the Library of Congress, told me. "There are a lot of things going on in the government, and we want to take a snapshot of that."
While the End of Term archive is a routine, nonpartisan project, both Grotke and Bailey say that there has been an increased sense of urgency this year as academics have expressed their concerns about the potential for data destruction under Trump.
"We've definitely seen much more attention to the issue with this transition than previous ones," Bailey said. "The ephemerality of web content is something people don't think about until there's a catalyzing factor or event."
The End of Term archive is taking suggestions for specific government websites that should be crawled before the transition. You can submit URLs here.