The National Research Council’s report, Biotechnology Research in an Age of Terrorism, provides basic guidelines for the future of safe biotechnology research. It recommends educating researchers on the potential misuse of their biotechnology research (see footnote below), creating international organizations and standards (e.g., an IAEA of biotechnology), requiring oversight of “experiments of concern,” and reviewing scientists’ publications. Because biotechnology research is “dual-use,” the report acknowledges that any regulations it implements must not hinder the advances that benefit society, particularly in relation to health.
However, while it acknowledges the impacts biotechnology has on society, the report overlooks the converse. As we discussed with MacKenzie’s work on missile accuracy, societal factors influence the way research is conducted. Biotechnology is no exception. Biotechnology research requires a team working through trial and error within an institutional framework. Therefore, I think the report’s focus on the publication of sensitive information as a security risk, without full consideration of what actually went on in the lab, is founded on some incorrect assumptions.
For instance, when providing examples of potentially dangerous publications, the report references the Australian mousepox experiment and the synthesis of the poliovirus genome in the Wimmer lab. It asserts that the mousepox researchers used “standard and quite simple procedures for incorporating the IL-4 gene into the mousepox genome,” and that because such methods are so standard, publishing them provides a “blueprint for terrorists.”
Kathleen Vogel’s article, Framing biosecurity: an alternative to the biotech revolution model?, pushes back against the supposedly straightforward nature of these procedures. She notes that the results hinged on knowledge gained from years of research and on practices that the lab itself had developed. She concludes that the Wimmer experiment was “not based on cutting edge technologies, but was rooted in more evolutionary and well established laboratory practices and techniques” (Vogel). In other words, synthesizing the poliovirus is not as cut and dried as the report suggests. The technological breakthrough was a product of sustained research and years of accumulated experience.
In my opinion, this undermines the report’s recommendation to limit scientific publications, even at the level of self-governance. Because of the institutional knowledge required, we should worry less about what specific information is made public. For similar reasons, it’s hard for me to imagine ‘amateurs’ reading a paper that discusses the methods for synthesizing the poliovirus and then having a lethal garage-made virus the next day. In terms of bioterrorism, I would be more worried about researchers taking their experience and “going rogue.” To curb this in the future, I could easily see researchers being required to hold credentials or clearances to work with certain materials.
Do you think the omission (censorship?) of certain methods is an effective tool to manage dual-use research in relation to the other recommendations? If so, who decides how the methods should be edited? The scientists? The publishers? The government? Furthermore, is there anything that you would change about the report’s recommendations? — Tori
Footnote: I believe these discussions about the potential hazards of research also include the accidental release of engineered organisms. For those interested, in the field of synthetic biology, suggested safeguards include working with auxotrophic organisms and gene-flow barriers.