The Pandora’s Box of Biology


On December 20, 2011, the press announced that the US government had asked two scientific journals – Science and Nature – to refrain from publishing a full account of an experiment that increased the transmissibility of the H5N1 bird flu virus.1 This unprecedented action was motivated by government concerns that bioterrorists might use the published experimental details to recreate the deadly virus and unleash a global pandemic. This is not the first time that scientific breakthroughs have set off alarm bells. Since the anthrax-laced letters of 2001, the US government has been on high alert, issuing regular warnings about the misuse of biotechnology. This anxiety is rooted in the belief that globalization and the rapid development of biotechnology facilitate access to specialized knowledge, making it easier for terrorists to apply scientific advances to nefarious purposes. Yet the idea that knowledge created by highly specialized scientists will easily trickle down to “comparatively low-skilled practitioners” via written documents has no solid foundation. Research in the field of science and technology studies has shown that knowledge tends to remain confined to the small groups of scientists who created it, because it has a tacit component that cannot readily be transferred to other individuals or locations. Science and weapons development are also subject to organizational and managerial demands that affect scientific results. Access to written information alone therefore does not allow the easy replication of previous work. The question remains: what conditions are required to replicate past work, and can terrorist groups create such conditions?

There is no denying that the field of synthetic biology has experienced tremendous progress in recent years. In the past decade alone, several experiments have illustrated the rapid growth of the technology and the concomitant decrease in the cost and time required to achieve significant results. For example, in 2001, a team of Australian scientists inadvertently developed a lethal mousepox virus,2 while, in 2002, a team of virologists at the State University of New York at Stony Brook synthesized the poliovirus using genetic information obtained from the internet, oligonucleotides3 ordered from a commercial company, and common biological research equipment.4 In 2005, researchers at the Centers for Disease Control and Prevention in Atlanta, Georgia, and other scientific institutions reconstructed the complete 1918 flu virus, using gene sequences extracted from the preserved tissue of pandemic victims.5 The latest experiment on the H5N1 virus is thus by no means an exceptional illustration of the perceived danger of biotechnology exploited for harmful effect.

Occurring in the wake of the anthrax letters, these events spurred a controversial debate about whether scientific work and publications should be subject to stricter controls. Proponents claim that such work, easily diffused over the internet, produces blueprints that can be used by terrorists or rogue states. Opponents, on the other hand, argue that science should not be censored and favor instead a system of self-regulation.6 But a missing ingredient in this important debate is the question of whether, and to what extent, these published experiments can be readily reproduced by terrorists and rogue states. Do these experiments rely on highly routinized, standardized, and automated methods that could be replicated even by individuals with limited knowledge? Has science become a “black box,” transforming complicated, painstakingly learned laboratory techniques into easily and quickly executed routines?

Despite this undeniable technological progress, post-9/11 biotechnology experiments have only the appearance of simplicity and speed. Both the 1918 and H5N1 flu virus experiments took about a decade of painstaking planning and benchwork. The research also required contributions from teams of scientists with specific skills, working in highly contained environments due to safety concerns.7 Similarly, the poliovirus synthesis took three years to accomplish, but the work relied on a decade of research and on the knowledge accumulated by the Stony Brook scientific team during that period.8 The highly publicized two-week production of an infectious bacteriophage by the Venter Institute was likewise the product of a decade-long research project involving world experts in DNA synthesis, one of whom had been studying that specific bacteriophage for over 40 years, while another was a Nobel laureate.9

The methods used in these experiments are also only seemingly straightforward; in some cases, although the processes are described in textbooks and are well known within the scientific community, they are in fact difficult to master and replicate successfully. For example, the poliovirus synthesis hinges upon the production of a cell extract that is used to grow the virus. The common technique for producing the cell extract consists of manually crushing bovine cells using an instrument resembling a mortar and pestle. In spite of this apparent simplicity, producing a good cell extract may take several months, and even experienced scientists and technicians do not always succeed. Indeed, much of the technique resides in skills or know-how honed over years of experience, which allow scientists to develop a “feel” for how strongly and for how long they must crush the cells. The success of the technique also depends on laboratory routines and disciplines that cannot always be transferred to a new location. The composition of the ingredients used – bovine serum and water, for example – varies seasonally and from one location to another, making replication in a different context more challenging, if not impossible. This was illustrated by a Belgian postdoc who spent six years at the Stony Brook laboratory learning the technique, but was then unable to reproduce his successful results in his own laboratory in Belgium. Yet without a good cell extract the poliovirus synthesis cannot be completed. (Kathleen M. Vogel, Biothreats and Policy Logics (Baltimore, Md.: Johns Hopkins University Press, forthcoming).)

The use of technology that automates processes formerly carried out manually does not guarantee success either. The polymerase chain reaction (PCR) machine is now commonly used to amplify DNA segments, a task that twenty years ago required long and assiduous manual work. Yet users of the machine report that it does not eliminate the need for individual skills and know-how, and that solving problems in its use frequently requires cooperation with other scientists.10

Scientific publications typically do not report the difficulties, mistakes, and failures that scientists endure, nor do they clearly specify how problems were solved. They present only the successful results, in sanitized form. The truth is that, in spite of technological progress, scientific work remains the cumulative and cooperative product of teams of scientists whose skills derive from years of experimentation and testing. Unlike the nuclear field, where the laws of physics introduce a certain level of certainty into experimentation, biology deals with live organisms that are sensitive to their environment, making their behavior difficult to predict. To say that biology is more art than science is not an overstatement. But this is an art that requires a certain organizational environment to express itself fully.

The Soviet biological weapons program was by far the largest and most advanced bioweapons program ever undertaken. Comprising dozens of research institutes, production plants, and test sites, the program developed and produced large quantities of anti-human, anti-plant, and anti-animal viruses and bacteria, some genetically engineered to resist antibiotics. This characterization, however, hides an unrecognized aspect of the Soviet program: some of its facilities were unsuccessful in their scientific pursuits despite having access to unique expertise and virtually unlimited financial and material resources. These difficulties were due in part to organizational and managerial factors that had a negative impact on scientific results. For example, facilities that adopted a hierarchical organization and an autocratic style of management were less successful than those that opted for a more inclusive and democratic approach, which encouraged scientists and technicians of various ranks to contribute to the larger goal. Insofar as the success of scientific work depends on the ability of scientists and technicians to collaborate, exchange information, and pool their individual expertise to solve problems collectively, autocratic organizations that restrict the flow of information and strictly regulate interactions between individuals prevent the efficient use of existing knowledge and limit the creation of new knowledge. By contrast, democratic organizations tend to promote information exchange and cooperation, creating the conditions for more efficient use of knowledge and for innovation.11

The impact of organizational factors can also be detected in smaller programs, including terrorist ones. For example, the Japanese terrorist group Aum Shinrikyo, which conducted a chemical attack on the Tokyo metro in 1995, was unable to produce a biological weapon despite having invested about $10 million in its program over a six-year period. The group was hampered from the start by its members’ limited scientific knowledge and lack of access to virulent strains. Importantly, the group also adopted a hierarchical structure, with decisions made by Aum founder Shoko Asahara and his inner circle. Decision-making was often premised on the irrational beliefs of this inner circle rather than on scientific facts. To ensure secrecy, the group also compartmentalized its biological weapons program, restricting access to laboratories and information to a small group of individuals. In doing so, the group dramatically reduced the knowledge base available for its scientific work. With a knowledge base that was meager to begin with, the chances of progress were slim.12

Authors of scientific articles include only a portion of the knowledge needed to achieve their reported successes. Missing from such publications is the extensive tacit knowledge, learned by doing rather than by reading, that is embedded in the minds of scientists and technicians and in the routines and disciplines developed within their laboratory. To use such information effectively, an outsider would have to interpret it extensively, and that in turn requires possessing the base knowledge needed to make sense of the data. Thus far, terrorists have been unable to gather teams of scientists and technicians that even remotely approximate the required expertise.

Could terrorists, or even rogue states, successfully replicate past work if they assembled the required team? Past attempts at replicating weapons work, using data obtained from a previous program or from another facility within the same program, have not fared well. For example, within the Soviet bioweapons program, a Kazakh production plant struggled mightily to produce large quantities of an anthrax weapon designed at a Russian institute. Success required five years of additional benchwork and testing, as well as visits to Kazakhstan by the weapon’s original Russian authors, and even then the result proved to be a different anthrax weapon.13 The history of “reverse engineering,” or attempts at essentially copying an existing weapon, is replete with examples of the final product exhibiting important differences from the copied sample.14

These examples suggest that documents or items produced within a specific context cannot be easily replicated in a new environment without extensive adaptation. Even if terrorists or a rogue state were able to assemble the right expertise, and assuming they had the required financial and material resources, their replication efforts would probably face similar challenges, possibly requiring several years of work with no assurance of eventual success.

These groups would also need to create a work environment that allows cooperation and information exchange within the scientific team. This is particularly important where scientists must decipher incomplete data, a task that benefits from the collective interpretation of information. This kind of organization, however, creates greater risks of detection: more people would be aware of the group’s efforts, increasing the potential for leaks or infiltration. Terrorists have so far preferred an autocratic, hierarchical organization because it better ensures covertness, at the cost of the efficient use of knowledge.

Replication of past work using only written documents is fraught with difficulties, even under the best circumstances. It is even more difficult for terrorists with limited scientific expertise and skills, who are highly likely to employ an organizational model that restricts information exchange and cooperation. In this context, the US government’s effort to restrict publication of the scientific work related to the H5N1 transmissibility study is not likely to yield additional security benefits. The legacy of the 9/11 attacks in fostering highly risk-averse threat assessments is likely to endure for the foreseeable future. Nonetheless, it will become increasingly important to avoid focusing primarily on the tangible aspects of weapons proliferation (documents, material, equipment) while underplaying the value of intangible factors (knowledge creation and transfer, and organizational effects). Knowledge about such intangible factors offers policymakers a richer and more effective range of options for slowing or stopping programs of proliferation concern.

Sonia Ben Ouagrham-Gormley is an Assistant Professor in the Biodefense Program at George Mason University.


  1. Denise Grady and William Broad, “Seeing Terror Risk, U.S. Asks Journals to Cut Flu Study Facts,” The New York Times, December 20, 2011.
  2. R.J. Jackson, A.J. Ramsay, C.D. Christensen, S. Beaton, D.F. Hall, and I.A. Ramshaw, “Expression of Mouse Interleukin-4 by a Recombinant Ectromelia Virus Suppresses Cytolytic Lymphocyte Responses and Overcomes Genetic Resistance to Mousepox,” Journal of Virology, Vol. 75, No. 3 (2001): 1205–1210.
  3. An oligonucleotide is a short strand of nucleic acids.
  4. Jeronimo Cello, Aniko V. Paul, and Eckard Wimmer, “Chemical Synthesis of Poliovirus cDNA: Generation of Infectious Virus in the Absence of Natural Template,” Science, Vol. 297, No. 5583 (August 2002): 1016–1018.
  5. Jocelyn Kaiser, “Resurrected Influenza Virus Yields Secrets of Deadly 1918 Pandemic,” Science, October 7, 2005; Gina Kolata, “Experts Unlock Clues to Spread of 1918 Flu Virus,” The New York Times, October 6, 2005.
  6. National Research Council Committee on a New Government-University Partnership for Science and Security, Science and Security in a Post-9/11 World (Washington, DC: National Academy Press, 2007).
  7. Doreen Carvajal, “Security in Flu Was Paramount, Scientist Says,” The New York Times, December 21, 2011; Kolata, “Experts Unlock Clues to Spread of 1918 Flu Virus.”
  8. Gina Kolata, “Polio Synthesis in the Test Tube,” The New York Times, December 13, 1991; and Sonia Ben Ouagrham-Gormley and Kathleen Vogel, “The Social Context Shaping Bioweapons (Non)proliferation,” Biosecurity and Bioterrorism: Biodefense Strategy, Practice, and Science, Vol. 8, No. 1 (March 2010): 9–24.
  9. Kathleen M. Vogel, “Framing Biosecurity: An Alternative to the Biotech Revolution Model?” Science and Public Policy, Vol. 35, No. 1 (February 2008): 45–54.
  10. K. Jordan and M. Lynch, “The Sociology of a Genetic Engineering Technique: Ritual and Rationality in the Performance of a ‘Plasmid Prep’,” in Adele E. Clarke and Joan H. Fujimura, eds., The Right Tools for the Job: At Work in Twentieth-Century Life Sciences (Princeton, NJ: Princeton University Press, 1992): 77–114.
  11. Ben Ouagrham-Gormley and Vogel, “The Social Context Shaping Bioweapons (Non)proliferation.”
  12. Sonia Ben Ouagrham-Gormley, “Barriers to Bioweapons: Intangible Obstacles to Proliferation,” International Security (forthcoming, Spring 2012).
  13. Ben Ouagrham-Gormley and Vogel, “The Social Context Shaping Bioweapons (Non)proliferation.”
  14. Interview with Dennis Gormley, who worked for nearly a decade in a US government laboratory on foreign material exploitation and reverse-engineering projects, January 27, 2012, Flint Hill, Virginia.



This entry was posted on Wednesday, April 25th, 2012, and is filed under Biotechnology, Proliferation, Research, Security, Technology, Terrorism.