On Biotechnology Research

The National Research Council’s report, Biotechnology Research in an Age of Terrorism, provides basic guidelines for the future of safe biotechnology research. It recommends educating researchers on the potential misuse of their biotechnology research (see footnote below), creating international organizations and standards (e.g., an IAEA for biotechnology), requiring oversight of “experiments of concern”, and reviewing scientists’ publications. Since biotechnology research is “dual-use”, the report acknowledges that any regulations implemented must not hinder the advances that benefit society, particularly in relation to health.

However, while it acknowledges the impacts biotechnology has on society, the report overlooks the converse: as we discussed with MacKenzie’s work on missile accuracy, societal factors also shape how research is conducted. Biotechnology is no exception. Biotechnology research requires a team working through trial and error within an institutional framework. Therefore, I think the focus on the publication of sensitive information as a security threat, without full consideration of what actually went on in the lab, is founded on some incorrect assumptions.

For instance, when providing examples of potentially dangerous publications, the report references the synthesis of the poliovirus genome in the Wimmer lab, as well as the Australian mousepox experiment, in which it asserts the scientists used “standard and quite simple procedures for incorporating the IL-4 gene into the mousepox genome.” Because the methods were so standard, the report argues, publishing such processes provides a “blueprint for terrorists.”

Kathleen Vogel’s article, Framing biosecurity: an alternative to the biotech revolution model?, pushes back against the supposedly straightforward nature of these procedures. She notes that the results hinged on knowledge gained from years of research and on practices that the lab itself had developed. She concludes that the Wimmer experiment was “not based on cutting edge technologies, but was rooted in more evolutionary and well established laboratory practices and techniques” (Vogel). In other words, synthesizing the poliovirus is not as cut and dried as the report suggests. The technological breakthrough was the product of painstaking research and years of experience.

In my opinion, this undermines the report’s recommendation to limit scientific publications, even at the level of self-governance. Because of the institutional knowledge required, we should worry less about what specific information is made public. For similar reasons, it’s hard for me to imagine ‘amateurs’ reading a paper that discusses methods for synthesizing the poliovirus and then having a lethal garage-made virus the next day. In terms of bioterrorism, I would be more worried about researchers taking their experience and “going rogue”. To curb this in the future, I wouldn’t be surprised if researchers were required to hold credentials or clearances to work with certain materials.

Do you think the omission (censorship?) of certain methods is an effective tool to manage dual-use research in relation to the other recommendations? If so, who decides how the methods should be edited? The scientists? The publishers? The government? Furthermore, is there anything that you would change about the report’s recommendations? — Tori

Footnote: I believe these discussions about the potential hazards of research should also include the accidental release of biotechnology. For those interested, in the field of synthetic biology, suggested safeguards include working with auxotrophic organisms and gene-flow barriers.

5 thoughts on “On Biotechnology Research”

  1. Tori, I think you made a good point. While these biological papers (e.g., Wimmer’s study) make their findings sound straightforward and easy to emulate, they are actually more complex. As the Aum Shinrikyo article points out, there is a necessary distinction between explicit (book) and tacit (hands-on) knowledge (p. 33). Biological weapons require more tacit knowledge; that is, developers need a certain level of expertise and know-how. This is why it was so difficult for Aum Shinrikyo to produce a viable bioweapon.

    Although I acknowledge and understand the dual-use implications, I agree that we shouldn’t worry as much about published studies. In a way, I think this open-source network is beneficial because we can track new discoveries and progress and monitor worrisome or threatening trends.

  2. I also agree that the report’s suggestion to limit publications is somewhat unwarranted, though I think we cannot entirely dismiss the reasoning behind this recommendation. As Tori and egelb note, in general we do not have to be too worried about limiting publications because producing a bioweapon requires tacit knowledge far beyond what someone could garner from reading a biotechnology article. Moreover, limiting publications could have negative effects by hindering the scientific collaboration that allows for advances.

    Nevertheless, it is important not to downplay how serious a threat it might be if terrorists were in fact able to use certain publications for destructive efforts. If terrorist organizations are capable of planning attacks as complex as, for example, 9/11, I would not put it beyond them to use information in biotechnology articles to their advantage in creating weapons. So I do not believe it is unreasonable to implement safeguards that would prevent terrorists’ access to sensitive biotechnology information. At the same time, legitimate scientists’ access to others’ biotechnology research should never be limited.

  3. I think the parallels between biotechnology and nuclear technology are particularly pertinent to your questions here. As Tori and the commenters before me have mentioned, the distinction between book knowledge and capability is important because research can mostly continue with few immediate national security implications. In both fields, the amount of information available about what is possible is quite large, whereas the technical details on how to create weapons are much harder to come by. Capability is also based on material availability; just as uranium is hardly commonplace, isolated pathogens are not readily available either.

    No doubt, “dual use” is also part of the lexicon of both nuclear technology and biotechnology. However, I would posit that we are less able to delineate between harmful and harmless uses of biotechnology than of nuclear technology. Genetic engineering, for example, has obvious legitimate agricultural purposes, but could also be repurposed for bio-weaponry. Moreover, in nuclear technology, very specific capabilities are required to enrich uranium to weapons grade, whereas such thresholds in capability are less evident in biotechnology. Verification is presumably also very difficult for biotechnology; aerial photographs, for example, would do little to reveal biotechnological capabilities, and many of these activities could be conducted covertly in existing, innocuous facilities. As such, I see the problem of dual use as more acute in the field of biotechnology.

  4. I’d like to take William’s comments one step further and argue that the issue of information transparency extends far beyond the arena of technology or national security; it is a problem nations will run into so long as competition factors into possessing knowledge in any way. However, I’d agree with Vogel’s article and argue that there is little point in trying to control the distribution of knowledge, because doing so is both logistically difficult and of little benefit to anyone in the long run. I’d argue that ideally, nations would not focus on concealing information and would instead devote attention and resources to furthering their own research, improving their own technology, and developing defense mechanisms (especially when it comes to biotechnology and potential biowarfare).

    Although from a practical standpoint this would be difficult to implement (because of the everlasting battle between the people who create a technology and the people who attempt to create technology to overcome it), in principle it would seem to have much more widespread benefits. Not only are we furthering the world’s wealth of knowledge, but by opening this reservoir of knowledge we are offering ourselves potential access to a better quality of life. As noted in Vogel’s article, it is not so easy to replicate many methods described in scientific papers, and as such the chances of misuse are relatively low. Even if the knowledge were taken up by other (perhaps not so peaceful) organizations, it would be better for us to be aware of the distinct possibility of other parties gaining it (since it is open sourced) and be aptly prepared for an attack, rather than assume the knowledge is safe under lock and key and be wholly unprepared if a terrorist group did manage to get its hands on the necessary materials and personnel.

  5. I would just like to add a counterpoint to this argument. While I do not believe there should be a curb on what information is presented to the public, remember that a terrorist is not simply an “amateur” reading a paper. A rebellious teenager sitting in his garage might try to create a virus for fun and give up after he realizes what a project it is. A terrorist cell, by contrast, may contain people who have studied science at a graduate or postgraduate level, who have much more patience than an amateur, and who may be willing to wait longer to correctly create a bioweapon. Synthesizing the poliovirus may take years of research, but it is important to remember that we are dealing with groups that may be very willing to spend years planning one swift and potentially devastating strike.