Newswise — Whether we know it or not, the virus that could cause the next major deadly epidemic may already be circulating in nature. It could be spreading among birds, monkeys or pigs, and all it might take to jump to humans is a small random mutation in one of its genes.
If we want to get out ahead of it, or even respond quickly when it emerges, there’s one way to do it: Study potentially dangerous viruses in secure research labs, and make genetic tweaks ourselves to see what changes would put us most at risk.
Besides teaching us important lessons about the virus or other microbe, this could help us be ready to respond to an outbreak and develop new treatments or vaccines without having to start from scratch when the disease is already spreading.
But making those tweaks could also open the door to a human-caused pandemic, if the altered microbes escape the lab or are obtained or grown by terrorists or rogue governments.
For the past three years, research like this on viruses with pandemic potential, such as avian influenza, has been all but halted while scientists, ethicists and public health experts weighed both sides of the coin.
This week, the green light went back on – but with a ‘caution’ sign and specific requirements put in place via a National Institutes of Health notice to researchers.
University of Michigan virus scientist Michael Imperiale, Ph.D., has been in the thick of the debate since it started more than five years ago. At that time, European and U.S. scientists showed that they could change the virus that causes bird flu slightly, and it would gain the ability to spread between mammals through the air. The ensuing debate over this kind of research, called “gain of function,” led to a ban on NIH funding for it in 2014.
At the time, Imperiale served on the National Science Advisory Board for Biosecurity (NSABB), which advises the NIH’s policy officials on issues related to the security of biological materials. More recently, he advised the National Academy of Sciences about options for managing the information that such research might produce.
He welcomes the new NIH policy on funding for work on “enhanced potential pandemic pathogens,” as he did the NSABB report that recommended much of what NIH has decided to do.
“It’s good to lift the moratorium on gain-of-function research,” explains Imperiale, who is the Arthur F. Thurnau Professor of Microbiology and Immunology in the U-M Medical School as well as U-M’s associate vice president for research policy and compliance. “I’ve been really worried that something will pop up in nature, and we’ll be behind the 8 ball.”
When he announced the new policy Monday, NIH Director Francis Collins noted that some gain-of-function research projects had actually been allowed to go forward during the past three years, after careful review.
That kind of review for proposed gain-of-function research projects seeking NIH funding now has the force of policy. Imperiale notes that this meshes with what the NSABB recommended last year, and what he and others have said since the start.
“The bottom line is, if it’s an important scientific question, let’s figure out a way to do the work safely,” he says. “We have to look at each project case by case.”
The grant application process is the perfect time to weigh the scientific and public health justification for the work against the potential risks if the altered microbe leaves the lab, he says. That way, the researchers seeking to do the work can make their case and the reviewers can raise concerns or set limits.
But at the other end of the scientific process, when the research has produced results, another thorny issue arises, he notes. He and Stanford University professor David Relman wrote a paper for the National Academy of Sciences earlier this year about what options might exist for controlling such information.
They looked not just at gain-of-function research, but at any scientific work that produces results that could be put to both good and nefarious uses. Their work was prompted in part by a debate over whether and how the bird flu research results could safely be shared with scientists, without tipping off potential terrorists or those who might seek to weaponize the altered virus.
In the end, the NAS consensus report on such “dual use” research concluded, based in part on Imperiale’s input, that the federal government’s ability to control the dissemination of information from such research is limited, and nonexistent when it comes to research not funded by federal agencies.
Instead, it called for a review process similar to what the NIH has just announced, to examine the potential course of action before research is funded. It also called for more international policymaking, and more education of researchers about biosafety and biosecurity when working with infectious organisms.
Ultimately, Imperiale says, the new NIH rule will allow researchers at well-equipped facilities with appropriate safety infrastructure to move forward.
As for the potential use by those with evil intent? He notes that it requires a high level of resources to do this kind of work without endangering the researchers themselves. But vigilance against such threats will always be important – which reinforces the need to study these organisms ourselves.
“This has been a good example of the effort to balance scientific and societal needs, but it dragged out for a long time,” says Imperiale, who worries that young scientists are likely shying away from studying these organisms in this way because of the ongoing debate. “We need to be more nimble as a scientific policy community.”