February 2020. I was still living in Northern California. That month two adults came down with a debilitating flu bug. Not only did this virus leave both of them weakened for a couple of weeks, but it also gave them excruciating headaches for the first few days they were sick. Neither of their kids got sick. (Surprising, since they were always the first to come down with whatever was going around.) Me? I remember having some congestion for a day or two. Nothing out of the ordinary. Three days later I was back to my daily Power Walks and taking care of the other adult.
Shortly thereafter we were to learn there was a new bug in town – one that would be labeled “highly contagious.” (This is a direct quote from the one adult who worked at the nearby hospital. She’s the Vice President of a department whose function is to collect, analyze, and monitor diseases and viruses currently circulating in the area so that the data can be shared with other hospitals around the country.)
Two weeks later, in the middle of March, the whole country went mad and imposed a “lockdown”. This new bug had a name: Covid-19. The symptoms that were published early on indicated that, in addition to knocking the shit out of a person (for weeks), it also produced horrible headaches. My guess is we were all exposed to The Jinn back in February of 2020.
So why, in spite of my being under a great deal of psychological and emotional stress, was I the one adult exposed to the bug who didn’t fall ill? My guess is that (1) I didn’t have Covid-19 and instead caught a mild cold, or (2) I gave Covid-19 the boot straight away. Either way, I give my healthy immune system a big high five!
Speaking of highly contagious…
Published: May 23, 2012
Bird-flu research: The biosecurity oversight
The packages that started arriving by FedEx on 12 October last year came with strict instructions: protect the information within and destroy it after review. Inside were two manuscripts showing how the deadly H5N1 avian influenza virus could be made to transmit between mammals. The recipients of these packages — eight members of the US National Science Advisory Board for Biosecurity (NSABB) — faced the unenviable task of deciding whether the research was safe to publish.
The group deliberated. Soon, the rest of the NSABB’s 22 voting members and two dozen non-voting members and advisers were drawn in. For five-and-a-half weeks, they pored over the data in the papers, weighing the benefits of sharing the information against the risk that doing so might lead to the accidental or intentional release of a lethal new virus. They exchanged views in hundreds of e-mails and in more than 24 hours of teleconference calls.
On 21 November, the NSABB recommended that journals should redact the papers, publishing their conclusions but sharing methods and data only with approved scientists and health officials. It was the first time that the board had recommended any such restriction since it was convened in 2005, and it sparked a global debate — aired in journals, meetings, blogs and newspapers — that is still raging and has left the US government in an awkward spot. “The United States funded this research and then wanted to censor it,” says David Fidler, who teaches international law at Indiana University Bloomington. “This looked dysfunctional.”
Throughout these turbulent months, the spotlight has shone as much on the NSABB as it has on the mutant flu viruses. The board’s members, with backgrounds ranging from biology to medicine to national security and law, have been developing guidelines for biosecurity oversight for nearly seven years. The flu research was a major test of the principles they had been espousing.
By all appearances, the board struggled. By mid-February, the NSABB was under pressure to overturn its initial assessment. And in the last days of March, it did — voting unanimously in favour of full publication for one paper, which appeared early this month. The board also recommended that the second paper be published, but six members dissented, arguing that the work still posed significant concerns. (That paper’s publication is expected within weeks.) The whole episode has left many people with questions. Could the board have done better? Why wasn’t the research flagged earlier? And is there a way to publish sensitive information while minimizing risks?
There is one point of agreement, says David Relman, a microbiologist at Stanford University in California and member of the NSABB: “This is not the way any of us wants to see these issues discussed, that is, at the eleventh hour and fifty-ninth minute.”
The NSABB’s roots can be traced back to October 2001, when letters carrying anthrax spores were sent to several public figures around the country (see ‘Threat and response’). In response, the US government invested billions of dollars to prepare for future acts of bioterror, much of it channelled into pathogen research through the National Institute of Allergy and Infectious Diseases (NIAID) in Bethesda, Maryland. In parallel, Congress asked the National Academies to form a panel to recommend how dual-use research — work that could carry bioterror risks as well as benefits — should be identified, regulated and reported. Scientists were anxious to show that they could police their own work and avoid heavy-handed or cumbersome regulation from above. “The science community ought to come up with a process before the public demands the government do it for them,” warned Parney Albright of the US Department of Homeland Security in 2003.
Geneticist Gerald Fink at the Massachusetts Institute of Technology in Cambridge was chosen to chair the panel. The recommendations in the resulting ‘Fink report’, published in 2004, set out seven ‘deadly sins’: types of research that should warrant close scrutiny, such as experiments to render a vaccine ineffective or to make a pathogen more virulent. The report also called for the creation of a national advisory board to further explore the issues on a national and international stage. This would become the NSABB, an independent panel that is managed and supported by the National Institutes of Health (NIH). In June 2005, NIH director Elias Zerhouni swore in 23 NSABB members in Bethesda. Paul Keim, a microbiologist at Northern Arizona University in Flagstaff and acting chair of the NSABB, says that the ceremony involved the raising of hands. “We all kept from giggling,” he says.
Right away, the board started to flesh out guidelines for a US policy on dual-use research. Its flagship document, released in 2007 and building on the Fink report, emphasized local self-governance, suggesting, for example, that investigators monitor their own and colleagues’ projects, possibly with the help of existing institutional biosafety committees.
Although not officially part of the board’s remit, the NIH also called on the NSABB to review the occasional paper that raised biosecurity concerns. The first two to land in the board’s lap, in 2005, dealt with efforts to resurrect the Spanish flu virus that was responsible for millions of deaths immediately after the First World War. The board recommended that the papers be published in full. Keim says he now wishes that the group had had more time to deliberate over the Spanish flu work, which raised many of the same issues as the current debate. “I guess I have some regrets about that decision because of the impact it would have had on policy,” he says.
“The United States funded this research and then wanted to censor it.”
Nevertheless, the papers the board received last October were different from those it had handled before. Their roots go back to 1997, when H5N1 started devastating bird populations worldwide and health officials voiced alarm about the catastrophe that could ensue if the disease gained the ability to jump between humans. In 2006, the NIH convened a blue-ribbon panel to identify priority research on avian influenza. Among other projects, it highlighted the need for experiments to see how bird flu might evolve the ability to spread from person to person. Soon after, the NIH commissioned and funded several such projects, including one from Ron Fouchier at the Erasmus Medical Center in Rotterdam, the Netherlands, and one from Yoshihiro Kawaoka at the University of Wisconsin-Madison and the University of Tokyo. Robert Webster, a virologist at St Jude Children’s Research Hospital in Memphis, Tennessee, and a member of the blue-ribbon panel, says that it paid close attention to the stringent biosafety requirements of such work, but that dual-use concerns “didn’t really surface”.
They should have, says Keim. The experiments committed at least two of the Fink report’s deadly sins: they deliberately changed the host range of a pathogen and they increased its transmissibility. “You think about adapting H5N1 to mammals,” Keim says, and you quickly “realize that there is the potential to do something very dangerous”.
Concerns surfaced in September 2011, when Fouchier presented his results at a high-profile meeting in Malta. He described, in ominous terms, how he had mutated wild H5N1 virus to make it more likely to infect human cells. He had then let the virus evolve in ferrets, a good model for human transmission, until it was able to spread through the air by a cough or a sneeze. Kawaoka took a different approach, mutating a single gene from H5N1 and plugging it into a less pathogenic viral genome. What resulted — two influenza viruses that could spread in mammals, that most humans had never been exposed to and that stemmed from a virus with the potential to kill — was worrying.
Still, the board struggled with its decision. At first, says NSABB member Arturo Casadevall, a microbiologist at the Albert Einstein College of Medicine in New York, “I was very uncomfortable with the idea of redacting information because I think that it’s a slippery slope”. But the data and expert analysis assembled by the board convinced him that what Fouchier and Kawaoka had done was too easy to repeat. “We just didn’t think it would be a good idea to put a recipe out there,” he says. Michael Osterholm, a public-health researcher at the University of Minnesota in Minneapolis, emphasized his support for the research, but stressed the precautionary principle. Once the work was published, it could not be taken back. “You can’t unring a bell,” he said on several occasions.
In late December, the US Department of Health and Human Services, which oversees the NIH, announced that it would follow the NSABB’s advice. The response was severe, says Keim. “That redaction approach has been universally panned,” he says. “The investigators hated it, the people who weren’t going to get the data hated it. The government hated it because they couldn’t figure out how to do it.”
Meanwhile, the NSABB’s members were scrambling to make clear that the issues needed international discussion. In mid-February, Kawaoka and Fouchier presented their work at a closed meeting at the World Health Organization (WHO) in Geneva, Switzerland. They assured the researchers that the benefits — for monitoring wild viruses for potentially dangerous mutations and for vaccine development — outweighed the risks. They also explained that the mutant viruses weren’t necessarily lethal to the ferrets, something that hadn’t been clear to everyone before. The attendees, mostly academic flu researchers, recommended that both papers be published in full.
In light of the new information, the NIH asked the NSABB to reconsider its position. A workshop was scheduled for 29–30 March.
The meeting started at 7 a.m. in a sixth-floor conference room of building 31 on the NIH campus in Bethesda. Keim had heard the presentations at Geneva, but still couldn’t predict how the rest of the board were going to react. “I was not placing bets either way.” The voting members sat round a conference table, with about 60 administrators, government officials and ex officio members looking on. Everyone was given two hours in silence to review revised manuscripts from Kawaoka’s and Fouchier’s teams. The researchers had edited the papers to clarify the benefits of the research and to explain the safety measures taken during work with the viruses. Later, they gave presentations. Fouchier was reportedly questioned for two hours.
By this point it was clear that Kawaoka’s paper posed less of a threat than Fouchier’s because of the low pathogenicity of his hybrid virus. But Relman and other members of the NSABB say that they were not reassured by Fouchier or by the revisions to his manuscript. “There were no new data that for me diminished the evidence for mammal-to-mammal transmissibility and no data that convinced me that the virulence was any less in his mutant viruses than it was in the wild-type parental H5N1 strains,” Relman says.
The board also heard that the practical and political barriers to redaction looked formidable. NIH director Francis Collins told them that export-control rules and freedom-of-information laws in other countries would make it impossible to implement a system for selectively releasing data quickly. Moreover, such a system could jeopardize the pandemic-influenza preparedness framework, an international agreement to share influenza viral samples and information that had been hammered out in 2011 by the WHO after years of debate. For officials in countries such as Indonesia, where poultry farmers have faced financial ruin because of H5N1, a decision to redact information sounded like a decision to withhold it. It became clear to the board that redaction was effectively off the table, meaning that the NSABB could vote to publish the paper in full, or not at all.
After a full day of briefings and another of deliberation, the board voted. The members present at the meeting unanimously recommended publication of the Kawaoka paper and voted 12–6 in favour of publishing Fouchier’s.
Few came out of the meeting happy. Some were still unsure about how dangerous Fouchier’s virus really was. “Even the 12 who voted in favour of publication were uneasy about this uncertainty in the virus,” says Keim, who declined to reveal his vote. Relman, who voted against publication, says that the process felt unbalanced and that he didn’t have enough time to assess some new data presented there that had not yet been peer reviewed. “I do think questions should be asked about the manner and process by which we were asked to perform this reassessment,” he says.
“This is not the way any of us wants to see these issues discussed.”
Osterholm asked some of these questions in a sharply worded letter to the NSABB and Amy Patterson, the board’s director at the NIH, a week and a half after the meeting. (The letter was leaked to Science and Nature days later.) In it, Osterholm said that the presentations given at the meeting were one-sided and designed to favour full publication of the articles. He said that Fouchier had revealed at the meeting an additional mutation that makes H5N1 both transmissible through the air and deadly. This work “surely must be considered as a candidate for the next manuscript to be before the NSABB for review”, wrote Osterholm, who worried that all the same problems would come up again. In her response to the letter, Patterson respectfully disagreed with Osterholm’s complaints. But by this point, the spat had started to attract the attention of law-makers. Congressman Jim Sensenbrenner (Republican, Wisconsin) wrote letters to the NIH and to the White House asking how decisions about the research were reached.
People within the NSABB, and outside it, now say that the board did its best in a highly complex situation. But many point a finger at a flawed mechanism for identifying and dealing with dual-use research. “Almost at every step the system isn’t working very well for these projects that raise serious concerns about biosecurity,” Fidler says. The most pressing question is why the research wasn’t flagged up earlier for scrutiny.
The answer: the policy simply wasn’t in place. In its 2007 report, the NSABB recommended that the federal government develop guidelines and implement a code of conduct to help institutions and researchers to report potential risks at the earliest stage of project development. It also recommended the development of strategies for communicating sensitive research, including restricted publication. These recommendations went largely unheeded because scientists resisted the introduction of cumbersome new practices. “We got all worried about the possibility of these threats,” says Fidler, but when it came to imposing regulations on research, “we tended to back off”.
Now, the flu controversy has forced the US government’s hand. On 29 March, while the NSABB was being briefed, it released a policy that requires federal agencies to identify and monitor research projects they fund that tick boxes on the ‘deadly sins’ list. Tom Inglesby, who directs the Center for Biosecurity of UPMC in Baltimore, Maryland, welcomes the new policy. “It would be much more preferable for these decisions to go on at the beginning of this experimental process. It’s more fair to the scientists, more fair to their institutions, more fair to the journals and more fair to the NSABB,” he says.
Keim, however, points out that the policy does not require review by disinterested parties. “These are decisions that need to be made in the open with input from different segments of our society,” he says. It may be too much to expect scientists to coolly evaluate the risks of their own research against the benefits they gain personally from publication. And even if regulatory changes do take root in the United States, international agreement will take years to solidify. Keim and several others at the NSABB say that publishing with controlled access to certain data would still have been the preferred option for the H5N1 papers, but the challenges extend well past US borders.
Most observers and participants expect that the NSABB will continue to weigh in on policy development, although it may have to resolve questions about conflicts of interest first. In the wake of the flu controversy, some observers have questioned whether it is appropriate to have the NSABB under the control of the NIH — which funded the flu research — and populated by NIH-funded scientists. Board members might not have wanted to vote against publication if it risked biting the hand that feeds them. “I’d be lying if I didn’t say that that thought crossed my mind,” says Michael Imperiale, a virologist at the University of Michigan in Ann Arbor and a member of the board since its inception. Ultimately, he says, he followed his conscience, which favoured publication of both articles. Anthony Fauci, director of the NIAID and a non-voting member of the NSABB, calls the idea of the NIAID taking revenge against NSABB members for their vote “preposterous”.
The whole controversy has been an ordeal for those involved. But Casadevall takes a positive view. “The end result has been a tremendous education,” he says.
“I don’t know how much of a silver lining that is,” Fidler says. There’s little consensus as to what a new system for dual-use research oversight should look like, he says, and governments have simply kicked the can down the road in the past. “That may happen again, but at least it’s out in the open,” he says.
More articles of interest:
“Dual Use Research of Concern (DURC) is life sciences research that, based on current understanding, can be reasonably anticipated to provide knowledge, information, products, or technologies that could be directly misapplied to pose a significant threat with broad potential consequences to public health and safety, agricultural crops and other plants, animals, the environment, materiel, or national security. The United States Government’s oversight of DURC is aimed at preserving the benefits of life sciences research while minimizing the risk of misuse of the knowledge, information, products, or technologies provided by such research.”
“Certain gain-of-function (GOF) studies with the potential to enhance the pathogenicity or transmissibility of potential pandemic pathogens have raised biosafety and biosecurity concerns, including the potential dual use risks associated with the misuse of the information or products resulting from such research.”