Special Report

Molecular Manufacturing: Too Dangerous to Allow?

by Lifeboat Foundation Scientific Advisory Board member Robert A. Freitas Jr. [1]. The original paper is available at his site.
 
 

Overview


Close-up of a nanofactory. Courtesy Lizard Fire Studios. Watch the nanofactory in action!

One common argument against pursuing a molecular assembler or nanofactory design effort is that the end results are too dangerous. According to this argument [2, 3], any research into molecular manufacturing (MM) should be blocked because this technology might be used to build systems that could cause extraordinary damage. The kinds of concerns that nanoweapons systems might create have been discussed elsewhere, in both the nonfictional [4–6] and fictional [7] literature.
 
 

Dangers

Perhaps the earliest-recognized and best-known danger of molecular manufacturing is the risk that self-replicating nanorobots capable of functioning autonomously in the natural environment could quickly convert that natural environment (e.g., “biomass”) into replicas of themselves (e.g., “nanomass”) on a global basis, a scenario often referred to as the “gray goo problem” but more accurately termed “global ecophagy” [4]. As Drexler first warned in Engines of Creation in 1986 [8]:
“Plants” with “leaves” no more efficient than today’s solar cells could out-compete real plants, crowding the biosphere with an inedible foliage. Tough omnivorous “bacteria” could out-compete real bacteria: They could spread like blowing pollen, replicate swiftly, and reduce the biosphere to dust in a matter of days. Dangerous replicators could easily be too tough, small, and rapidly spreading to stop — at least if we make no preparation…. We cannot afford certain kinds of accidents with replicating assemblers.
Such self-replicating systems, if not countered, could make the earth largely uninhabitable [4, 7–9] — concerns that motivated the drafting of the Foresight Guidelines for the safe development of nanotechnology [10]. But, as the Center for Responsible Nanotechnology explains [5], (reference annotations added):
Gray goo would entail five capabilities integrated into one small package. These capabilities are: Mobility — the ability to travel through the environment; Shell — a thin but effective barrier to keep out diverse chemicals and ultraviolet light; Control — a complete set of blueprints and the computers to interpret them (even working at the nanoscale, this will take significant space); Metabolism — breaking down random chemicals into simple feedstock; and Fabrication — turning feedstock into nanosystems. A nanofactory would use tiny fabricators, but these would be inert if removed or unplugged from the factory. The rest of the listed requirements would require substantial engineering and integration [4].
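The essential point of this list is that the gray goo scenario requires all five capabilities to be integrated into a single small device; remove any one of them and the scenario fails. Below is a minimal sketch of that checklist logic, offered only as an illustration — the class and function names are this example's own invention, not anything drawn from the cited literature:

```python
# Illustrative sketch only: a toy checklist of the five capabilities listed above,
# not a model of any actual nanosystem design.
from dataclasses import dataclass

@dataclass
class ReplicatorDesign:
    mobility: bool      # can travel through the environment
    shell: bool         # barrier against diverse chemicals and ultraviolet light
    control: bool       # onboard blueprints plus the computers to interpret them
    metabolism: bool    # breaks down random chemicals into simple feedstock
    fabrication: bool   # turns feedstock into nanosystems

def is_potential_gray_goo(d: ReplicatorDesign) -> bool:
    """Gray goo requires all five capabilities integrated in one package."""
    return all([d.mobility, d.shell, d.control, d.metabolism, d.fabrication])

# A nanofactory's tethered fabricator has at most one of the five capabilities,
# so it fails the test; omitting any single capability defeats the scenario.
tethered_fabricator = ReplicatorDesign(False, False, False, False, True)
assert not is_potential_gray_goo(tethered_fabricator)
```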
 
Although gray goo has essentially no military and no commercial value, and only limited terrorist value, it could be used as a tool for blackmail. Cleaning up a single gray goo outbreak would be quite expensive and might require severe physical disruption of the area of the outbreak (atmospheric and oceanic goos [4] deserve special concern for this reason). Another possible source of gray goo release is irresponsible hobbyists. The challenge of creating and releasing a self-replicating entity apparently is irresistible to a certain personality type, as shown by the large number of computer viruses and worms in existence. We probably cannot tolerate a community of “script kiddies” [11] releasing many modified versions of goo.
 
Development and use of molecular manufacturing poses absolutely no risk of creating gray goo by accident at any point. However, goo-type systems do not appear to be ruled out by the laws of physics, and we cannot ignore the possibility that the five stated requirements could be combined deliberately at some point, in a device small enough that cleanup would be costly and difficult. Drexler’s 1986 statement can therefore be updated: We cannot afford criminally irresponsible misuse of powerful technologies. Having lived with the threat of nuclear weapons for half a century, we already know that.
 

Ignorance is Not Bliss


Less knowledge could mean fewer defenses. Courtesy The Onion.

Attempts to block or “relinquish” [3, 12] molecular manufacturing research will make the world a more, not less, dangerous place [13]. This paradoxical conclusion is founded on two premises. First, attempts to block the research will fail. Second, such attempts will preferentially block or slow the development of defensive measures by responsible groups. One of the clear conclusions reached by Freitas [4] was that effective countermeasures against self-replicating systems should be feasible, but will require significant effort to develop and deploy. (Nanotechnology critic Bill Joy, responding to this author, complained in late 2000 that any nanoshield defense to protect against global ecophagy “appears to be so outlandishly dangerous that I can’t imagine we would attempt to deploy it.” [12])
 
But blocking the development of defensive systems would simply ensure that offensive systems, once deployed, would achieve their intended objective in the absence of effective countermeasures. James Hughes [13] concurs: “The only safe and feasible approach to the dangers of emerging technology is to build the social and scientific infrastructure to monitor, regulate and respond to their threats.”
 
We can reasonably conclude that blocking the development of defensive systems would be an extraordinarily bad idea. Actively encouraging rapid development of defensive systems by responsible groups while simultaneously slowing or hindering development and deployment by less responsible groups (“nations of concern”) would seem to be a more attractive strategy, and is supported by the Foresight Guidelines [10]. As even nanotechnology critic Bill Joy [14] finally admitted in late 2003: “These technologies won’t stop themselves, so we need to do whatever we can to give the good guys a head start.”
 
While a 100% effective ban on development might, in theory, avoid the potential adverse consequences, blocking all groups for all time does not appear to be a feasible goal. The attempt would strip us of defenses against attack, increasing rather than decreasing the risks. In addition, blocking development would ensure that the substantial economic, environmental, and medical benefits [15] of this new technology would not be available.
 
 

A Cure Worse Than The Disease

Observes Glenn Reynolds [16]:
To the extent that such efforts [to ban all development] succeed, the cure may be worse than the disease. In 1875, Great Britain, then the world’s sole superpower, was sufficiently concerned about the dangers of the new technology of high explosives that it passed an act barring all private experimentation in explosives and rocketry. The result was that German missiles bombarded London rather than the other way around.
 
Similarly, efforts to control nanotechnology, biotechnology or artificial intelligence are more likely to drive research underground (often under covert government sponsorship, regardless of international agreement) than they are to prevent research entirely. The research would be conducted by unaccountable scientists, often in rogue regimes, and often under inadequate safety precautions. Meanwhile, legitimate research that might cure disease or solve important environmental problems would suffer.
Finally, and as explained elsewhere [17], it is well-known [18] that self-replication activities, as distinct from the inherent capacity for self-replication, are not strictly required to achieve the anticipated broad benefits of molecular manufacturing. By restricting the capabilities of nanomanufacturing systems simultaneously along multiple design dimensions such as control autonomy (A1), nutrition (E4), mobility (E10), immutability (L3, L4), etc. [19], molecular manufacturing systems — whether microscale or macroscale — can be made inherently safe.
 
 

Richard Feynman’s Nanotechnology Talks


As Phoenix and Drexler [20] noted in a 2004 paper:
In 1959, Richard Feynman pointed out that nanometer-scale machines could be built and operated, and that the precision inherent in molecular construction would make it easy to build multiple identical copies. This raised the possibility of exponential manufacturing, in which production systems could rapidly and cheaply increase their productive capacity, which in turn suggested the possibility of destructive runaway self-replication.
 
Early proposals for artificial nanomachinery focused on small self-replicating machines, discussing their potential productivity and their potential destructiveness if abused…. [But] nanotechnology-based fabrication can be thoroughly non-biological and inherently safe: such systems need have no ability to move about, use natural resources, or undergo incremental mutation.
 
Moreover, self-replication is unnecessary: the development and use of highly productive systems of nanomachinery (nanofactories) need not involve the construction of autonomous self-replicating nanomachines…. Although advanced nanotechnologies could (with great difficulty and little incentive) be used to build such devices, other concerns present greater problems. Since weapon systems will be both easier to build and more likely to draw investment, the potential for dangerous systems is best considered in the context of military competition and arms control.
Of course, it must be conceded that while nanotechnology-based manufacturing systems can be made safe, they also could be made dangerous. Just because free-range self-replicators may be “undesirable, inefficient and unnecessary” [20] does not imply that they cannot be built, or that nobody will build them. How can we avoid “throwing out the baby with the bathwater”?
 
The correct solution, first explicitly proposed by Freitas in 2000 [21] and later partially echoed by Phoenix and Drexler in 2004 [22], starts with a carefully targeted moratorium or outright legal ban on the most dangerous kinds of nanomanufacturing systems, while still allowing the safe kinds of nanomanufacturing systems to be built — subject to appropriate monitoring and regulation commensurate with the lesser risk that they pose.
 
 

Both Safe and Dangerous Technologies

Virtually every known technology comes in “safe” and “dangerous” flavors which necessarily must receive different legal treatment. For example, over-the-counter drugs are the safest and most difficult to abuse, hence are lightly regulated; prescription drugs, which are easier to abuse, are very heavily regulated; and other drugs, typically addictive narcotics and other recreational substances, are legally banned from use by anyone, even for medicinal purposes.
 
Artificial chemicals can range from lightly regulated household substances such as Clorox or ammonia; to more heavily regulated compounds such as pesticides, solvents and acids; to the most dangerous chemicals such as chemical warfare agents which are banned outright by international treaties.
 
Another example is pyrotechnics, which range from highway flares, which are safe enough to be purchased and used by anyone; to “safe and sane” fireworks, which are lightly regulated but still available to all; to moderately-regulated firecrackers and model rocketry; to minor explosives and skyrockets, which in most states can be legally obtained and used only by licensed professionals who are heavily regulated; to high-yield plastic explosives, which are legally accessible only to military specialists; to nuclear explosives, the possession of which is strictly limited to a handful of nations via international treaties, enforced by an international inspection agency.
 
Yet another example is aeronautics technology, which ranges from safe unregulated kites and paper airplanes; to lightly regulated powered model airplanes operated by remote control; to moderately regulated civilian aircraft, both small and large; to heavily regulated military attack aircraft such as jet fighters and bombers, which can only be purchased by approved governments; to intercontinental ballistic missiles, the possession of which is strictly limited to a handful of nations via international treaties.
 
Note that in all cases, the existence of a “safe” version of a technology does not preclude the existence of a “dangerous” version, and vice versa. The laws of physics permit both versions to exist. The most rational societal response has been to classify the various applications according to the risk of accident or abuse that each one poses, and then to regulate each application accordingly. The societal response to the tools and products of molecular manufacturing will be no different.
 
Some MM-based tools and products will be deemed safe, and will be lightly regulated. Other MM-based tools and products will be deemed dangerous, and will be heavily regulated, or even legally banned in some cases.
 
Of course, the mere existence of legal restrictions or outright bans does not preclude the acquisition and abuse of a particular technology by a small criminal fraction of the population. For instance, in the high-risk category, drug abusers obtain and inject themselves with banned narcotics; outlaw regimes employ prohibited poison chemicals in warfare; and rogue nations seek to enter the “nuclear club” via clandestine atomic bomb development programs.
 
Bad actors such as terrorists can also abuse less-heavily regulated products such as fully-automatic rifles or civilian airplanes (which are hijacked and flown into buildings). The most constructive response to this class of threat is to increase monitoring efforts to improve early detection and to pre-position defensive instrumentalities capable of responding rapidly to these abuses, as recommended in 2000 by this author [4] in the context of molecular manufacturing.
 
 

Accidents


For new technologies, the risk of accident or malfunction is less problematic than the danger of deliberate abuse. Engineers generally try to design products that work reliably, and companies generally seek to sell reliable products to maintain customer goodwill and to avoid expensive product liability lawsuits. But accidents do happen. Here again, our social system has established a set of progressive responses to deal efficiently with this problem.
 
A good example is the ancient technology of fire. The uses of fire are widespread in society, ranging from lightly-regulated matchsticks, butane lighters, campfires, and internal combustion engines, to more heavily regulated home HVAC furnaces, municipal incinerators and industrial smelters.
 
A range of methods are available to deal quickly and effectively with a fire that has accidentally escaped the control of its user. Home fires due to a smoldering cigarette or a blazing grease pan in the kitchen are readily doused using a common household fire extinguisher. Fires in commercial buildings (e.g., hotels) or industrial buildings (e.g., factories) are automatically quenched by overhead sprinkler systems.
 
When these methods prove insufficient to snuff out the flames, the local fire department is called in to limit the damage to just a single building, using fire trucks, water hoses and hydrants. If many buildings are involved, more extensive fire suppression equipment and hundreds of firefighters can be brought in from all across town to hold the damage to a single city block.
 
In the case of the largest accidental fires, like forest fires, vast quantities of heavy equipment are deployed including thousands of firefighters wielding specialized tools, bulldozers to dig firebreaks, helicopters with pendulous water buckets, and great fleets of air tankers dropping tons of fire retardants. (These progressive measures also protect the public in cases of deliberate arson.)
 
The future emergency response hierarchy for dealing with MM-based accidents will be no less exhaustive and may be equally effective in preserving human life and property, while allowing us to enjoy the innumerable benefits of this new technology. Notes Steen Rasmussen of Los Alamos National Laboratory in New Mexico: “The more powerful technology you unleash, the more careful you have to be.” [23]
 
The study of the ethical [24], socioeconomic [25–28] and legal [29] impact of replication-capable machines such as molecular assemblers, and of machines such as nanofactories that could build replicators, is still in its earliest stages, and there is additional discussion of safety issues elsewhere [30]. However, two important general observations about replicators and self-replication should be noted here.
 
 

Replication


First, replication is nothing new. Humanity has thousands, arguably even millions, of years of experience living with entities that are capable of kinematic self-replication. These replicators range from the macroscale (e.g., insects, birds, horses, other humans) on down to the microscale (e.g., bacteria, protozoa) and even the nanoscale (e.g., prions, viruses).
 
As a species, we have successfully managed the eternal tradeoff between risk and reward, and have successfully negotiated the antipodes of danger and progress. There is every reason to expect this success to continue. (As shown by the problem of invasive species, the biosphere requires time to adapt to new replicators, so human intervention may be required to prevent severe damage.)
 
The technologies of engineered self-replication, even at the microscale, are already in wide commercial use throughout the world. Indeed, human civilization is utterly dependent on self-replication technologies. Many important foods including beer, wine, cheese, yogurt, and kefir (a fermented milk), along with various flavors, nutrients, vitamins and other food ingredients, are produced by specially cultured microscopic replicators such as algae, fungi (yeasts) and bacteria.
 
Virtually all of the rest of our food is made by macroscale replicators such as agricultural crop plants, trees, and farm animals. Many of our most important drugs are produced using microscopic self-replicators — from penicillin produced by natural replicating molds starting in the 1940s [15] to the first use of artificial (engineered) self-replicating bacteria to manufacture human insulin by Eli Lilly in 1982 [31]. These uses continue today in the manufacture of many other important drug products such as: (a) human growth hormone (HGH) and erythropoietin (EPO), (b) precursors for antibiotics such as erythromycin [32], and (c) therapeutic proteins such as Factor VIII.
 
A few species of self-replicating bacteria are even used directly as therapeutic medicines, such as the widely available swallowable pills containing bacteria (i.e., natural biological nanomachines) for gastrointestinal refloration, as for example Salivarex™, which “contains a minimum of 2.9 billion beneficial bacteria per capsule” [33], and Alkadophilus™, which “contains 1.5 billion organisms per capsule” [34], both at a 2005 price of ~$(0.1–0.2) × 10⁻⁹ per microscale replicator (i.e., per bacterium).
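As a back-of-the-envelope illustration of where that per-replicator figure comes from (assuming, purely for the arithmetic, a retail price on the order of $0.30 per capsule, a number not stated above):

$$
\frac{\$0.30}{2.9 \times 10^{9}\ \text{bacteria per capsule}} \approx \$0.1 \times 10^{-9}
\qquad\text{and}\qquad
\frac{\$0.30}{1.5 \times 10^{9}\ \text{bacteria per capsule}} \approx \$0.2 \times 10^{-9}
$$

per bacterium, consistent with the quoted range.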
 
Some replicating viruses, notably bacteriophages, are used as therapeutic agents to combat and destroy unhealthful infectious bacterial replicators [35], and for decades viruses have served as transfer vectors to attempt gene therapies [36]. In industry, bacteria are already employed as “self-replicating factories” [37] for various useful products, and microorganisms are also used as workhorses for environmental bioremediation [38, 39], biomining of heavy metals [40], and other applications. In due course, we will learn to safely harness the abilities of nonbiological replication-capable machines for human benefit as well.
 
Second, replicators can be made inherently safe. An “inherently safe” kinematic replicator is a replicating system that, by its very design, is inherently incapable of surviving mutation or of undergoing evolution (and thus evolving out of our control or developing an independent agenda), and that, equally importantly, does not compete with biology for resources (or worse, use biology as a raw materials resource [4]).
 
 

Conclusion

One primary route for ensuring inherent safety is to combine the broadcast architecture for control [41] and the vitamin architecture for materials [42], which together eliminate the likelihood that the system can replicate outside of a very controlled and highly artificial setting. There are numerous other routes to this end [10, 19]. Many dozens of additional safeguards may be incorporated into replicator designs to provide redundant embedded controls and thus an arbitrarily low probability of replicator malfunctions of various kinds, simply by selecting the appropriate design parameters [19].
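As a purely conceptual sketch of why this combination blocks replication in the wild (the class and method names below are this illustration's own, not the actual designs described in [41, 42]): under the broadcast architecture the device carries no onboard blueprints and acts only on instructions streamed from an external controller, and under the vitamin architecture it cannot complete a copy without a synthetic “vitamin” component that does not occur in nature, so withholding either input halts replication.

```python
# Conceptual toy model only; not the broadcast or vitamin architectures of [41, 42].
from typing import Optional

class ControlledReplicator:
    """Replicates only when (a) an instruction arrives from an external broadcast
    channel and (b) a synthetic 'vitamin' part, unavailable in nature, is supplied."""

    def __init__(self) -> None:
        self.copies_built = 0   # no onboard blueprints, no autonomous agenda

    def step(self, broadcast_instruction: Optional[str], vitamin_supplied: bool) -> bool:
        # Outside the controlled facility both inputs are absent, so nothing happens.
        if broadcast_instruction != "BUILD_COPY" or not vitamin_supplied:
            return False
        self.copies_built += 1
        return True

r = ControlledReplicator()
assert r.step(None, vitamin_supplied=False) is False       # released "in the wild": inert
assert r.step("BUILD_COPY", vitamin_supplied=True) is True  # inside the facility: replicates
```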
 
Artificial kinematic replication-capable systems which are not inherently safe should not be designed or constructed, and indeed should be legally prohibited by appropriate juridical and economic sanctions, with these sanctions to be enforced in both national and international regimes.
 
In the case of individual lawbreakers or rogue states that might build and deploy unsafe artificial mechanical replicators, the defenses we have already developed against harmful biological replicators all have analogs in the mechanical world that should prove equally effective, or even superior. Molecular manufacturing will make possible ever more sophisticated methods of environmental monitoring, prophylaxis and safety. However, advance planning and strategic foresight will be essential in maintaining this advantage.
 
 

Notes and References

1
An earlier version of this essay appeared as portions of Sections 5.11 and 6.3.1 in: Robert A. Freitas Jr., Ralph C. Merkle, Kinematic Self-Replicating Machines, Landes Bioscience, Georgetown TX, 2004, p. 199 and pp. 204–206.
2
Sean Howard, “Nanotechnology and mass destruction: The need for an inner space treaty,” Disarmament Diplomacy 65 (2002); Lee-Anne Broadhead, Sean Howard, “The Heart of Darkness,” Resurgence #221, November/December 2003.
3
Bill Joy, “Why the future doesn’t need us”, Wired 8(April 2000). Response by Ralph Merkle, “Text of prepared comments by Ralph C. Merkle at the April 1, 2000 Stanford Symposium organized by Douglas Hofstadter”.
4
Robert A. Freitas Jr., “Some Limits to Global Ecophagy by Biovorous Nanoreplicators, with Public Policy Recommendations” Zyvex preprint, April 2000.
5
“Dangers of Molecular Manufacturing” Center for Responsible Nanotechnology, 2004.
6
K. Eric Drexler, “Chapter 11. Engines of Destruction”, Engines of Creation: The Coming Era of Nanotechnology, Anchor Press/Doubleday, New York, 1986. Mark Avrum Gubrud, “Nanotechnology and international security”, paper presented at the 5th Foresight Conference, November 1997. Lev Navrozov, “Molecular nano weapons: Research in China and talk in the West”, NewsMax.com, 27 February 2004. Jurgen Altmann, “Military uses of nanotechnology: Perspectives and concerns”, Security Dialogue 35(March 2004):61–79. Ray Kurzweil, The Singularity is Near: When Humans Transcend Biology, Penguin Books, New York, 2005.
7
Michael Crichton, Prey, HarperCollins Publishers, New York, 2002. Britt D. Gillette, Conquest of Paradise: An End-times Nano-Thriller, Writers Club Press, New York, 2003. John Robert Marlow, Nano, St. Martin’s Press, New York, 2004.
8
K. Eric Drexler, Engines of Creation: The Coming Era of Nanotechnology, Anchor Press/Doubleday, New York, 1986.
9
Philip K. Dick, “Second Variety,” Space Science Fiction, May 1953; also available in: Philip K. Dick, Second Variety and Other Classic Stories by Philip K. Dick, Citadel Press, 1991. Greg Bear, The Forge of God, Gollancz, New York, 1987. Greg Bear, Anvil of Stars, Century, London, U.K., 1992.
10
Foresight Institute, “Molecular Nanotechnology Guidelines: Draft Version 3.7”, 4 June 2000. Extensive excerpt.
11
According to cyberjournalist Clive Thompson [43], elite writers of software viruses openly publish their code on Web sites, often with detailed descriptions of how the program works, but don’t actually release them. The people who do release the viruses are often anonymous mischief-makers, or “script kiddies” — a derisive term for aspiring young hackers, “usually teenagers or curious college students, who don’t yet have the skill to program computers but like to pretend they do. They download the viruses, claim to have written them themselves and then set them free in an attempt to assume the role of a fearsome digital menace. Script kiddies often have only a dim idea of how the code works and little concern for how a digital plague can rage out of control. Our modern virus epidemic is thus born of a symbiotic relationship between the people smart enough to write a virus and the people dumb enough — or malicious enough — to spread it.”
 
Thompson goes on to describe his early 2004 visit to an Austrian programmer named Mario, who cheerfully announced that in 2003 he had created, and placed online at his website, freely available, a program called “Batch Trojan Generator” that autogenerates malicious viruses. Thompson described a demonstration of this program: “A little box appears on his laptop screen, politely asking me to name my Trojan. I call it the ‘Clive’ virus. Then it asks me what I’d like the virus to do. Shall the Trojan Horse format drive C:? Yes, I click. Shall the Trojan Horse overwrite every file? Yes. It asks me if I’d like to have the virus activate the next time the computer is restarted, and I say yes again. Then it’s done. The generator spits out the virus onto Mario’s hard drive, a tiny 3KB file. Mario’s generator also displays a stern notice warning that spreading your creation is illegal. The generator, he says, is just for educational purposes, a way to help curious programmers learn how Trojans work. But of course I could ignore that advice.”
 
Apparently top “malware” writers do take some responsible precautions, notes Thompson. For example, one hacker’s “main virus-writing computer at home has no Internet connection at all; he has walled it off like an airlocked biological-weapons lab, so that nothing can escape, even by accident.” Some writers, after finishing a new virus, “immediately e-mail a copy of it to antivirus companies so the companies can program their software to recognize and delete the virus should some script kiddie ever release it into the wild.”
12
Bill Joy, “Act now to keep new technologies out of destructive hands”, New Perspectives Quarterly 17(Summer 2001).
13
James J. Hughes, “Relinquishment or Regulation: Dealing with Apocalyptic Technological Threats”, Trinity College, Fall 2001.
14
Spencer Reiss, “Hope Is a Lousy Defense”, Wired, December 2003.
15
Robert A. Freitas Jr., Nanomedicine, Volume I: Basic Capabilities, Landes Bioscience, Georgetown, TX, 1999. Robert A. Freitas Jr., Nanomedicine, Volume IIA: Biocompatibility, Landes Bioscience, Georgetown, TX, 2003. Robert A. Freitas Jr., “Current Status of Nanomedicine and Medical Nanorobotics (Invited Survey)”, J. Comput. Theor. Nanosci. 2(March 2005):1–25.
16
Glenn Harlan Reynolds “Techno Worries Miss the Target”, SpeakOut.com, 8 June 2000.
17
Robert A. Freitas Jr., Ralph C. Merkle, Kinematic Self-Replicating Machines, Landes Bioscience, Georgetown TX, 2004; Sections 3.13.2.2, 4.9.3, 4.14, 4.17, 4.19, 5.7, 5.9.4.
18
K. Eric Drexler, Nanosystems: Molecular Machinery, Manufacturing, and Computation, John Wiley & Sons, New York, 1992.
19
Robert A. Freitas Jr., Ralph C. Merkle, Kinematic Self-Replicating Machines, Landes Bioscience, Georgetown TX, 2004, Section 5.1.9. The notations (A1, etc.) refer to specific sections in the cited literature.
20
Chris Phoenix, Eric Drexler, “Safe exponential manufacturing” Nanotechnology 15(2004):869–872. See also: Paul Rincon, “Nanotech guru turns back on ‘goo’”, BBC News Online UK Edition, 9 June 2004; and Liz Kalaugher, “Drexler dubs ‘grey goo’ fears obsolete”, Nanotechweb.org, 9 June 2004.
21
From Freitas (2000) [4]: “Specific public policy recommendations suggested by the results of the present analysis include: (1) an immediate international moratorium on all artificial life experiments implemented as nonbiological hardware. In this context, ‘artificial life’ is defined as autonomous foraging replicators, excluding purely biological implementations (already covered by NIH guidelines tacitly accepted worldwide) and also excluding software simulations which are essential preparatory work and should continue. Alternative ‘inherently safe’ replication strategies such as the broadcast architecture are already well-known….”
22
From Phoenix and Drexler (2004) [20]: “The construction of anything resembling a dangerous self-replicating nanomachine can and should be prohibited.”
23
Ronald Kotulak, “Science on verge of new ‘Creation’: Labs say they have nearly all the tools to make artificial life” Sun-Sentinel Tribune, 28 March 2004.
24
David S. Goodsell, Bionanotechnology: Lessons from Nature, John Wiley & Sons, New York, 2004.
25
Robert A. Freitas Jr., William P. Gilbreath, eds., Advanced Automation for Space Missions, NASA Conference Publication CP-2255 (N83–15348), 1982; and Robert A. Freitas Jr., “Noninflationary Nanofactories”, Nanotechnology Perceptions 2 (May 2006).
26
Murray Leinster, The Duplicators, Ace Books, New York, 1964; originally published as “The Lost Race”, Thrilling Wonder Stories, April 1949. Gerald D. Nordley, “On the socioeconomic impact of smart self-replicating machines”, CONTACT 2000, NASA/Ames Research Center.
27
V. Weil, “Ethical Issues in Nanotechnology,” in M.C. Roco, W.S. Bainbridge, eds., Societal Implications of Nanoscience and Nanotechnology, Kluwer, Dordrecht, 2001, pp. 193–198. R.H. Smith, “Social, Ethical, and Legal Implications of Nanotechnology,” in M.C. Roco, W.S. Bainbridge, eds., Societal Implications of Nanoscience and Nanotechnology, Kluwer, Dordrecht, 2001, pp. 203–211. See also the PDF version.
28
“Task Area 3: Problems of Self-replication, Risk, and Cascading Effects in Nanotechnology: Analogies between Biological Systems and Nanoengineering” in Philosophical and Social Dimensions of Nanoscale Research — From Laboratory to Society: Developing an Informed Approach to Nanoscale Science and Technology, Working Group for the Study of the Philosophy and Ethics of Complexity and Scale [SPECS], University of South Carolina NanoCenter, 17 March 2003.
29
Frederick A. Fiedler, Glenn H. Reynolds, “Legal Problems of Nanotechnology: An Overview”, Southern California Interdisciplinary Law Journal 3(1994):593–629. Ty S. Wahab Twibell, “Nano law: The legal implications of self-replicating nanotechnology”, Nanotechnology Magazine, 2000. John Miller, “Beyond Biotechnology: FDA Regulation Of Nanomedicine”, Columbia Science and Technology Law Review, Vol. IV, 2002–2003. Glenn Harlan Reynolds, “Nanotechnology and regulatory policy: three futures” Harv. J. Law & Technol. 17 (Fall 2003).
30
Robert A. Freitas Jr., Ralph C. Merkle, Kinematic Self-Replicating Machines, Landes Bioscience, Georgetown TX, 2004; Sections 2.1.5, 2.3.6, 5.1.9(L), 6.3.1, 6.4.4.
31
“Milestones in Medical Research”, Eli Lilly.
32
B.A. Pfeifer, S.J. Admiraal, H. Gramajo, D.E. Cane, Chaitan Khosla, “Biosynthesis of complex polyketides in a metabolically engineered strain of E. coli”, Science 291(2 March 2001):1790–1792, 1683 (comment).
33
“L-Salivarius Plus Other Beneficial Microflora”, Product Information Sheet No. 8058, Life Plus, 1996; “Life Plus Vitamin/Herbal Answer For a Healthy Digestive Tract”; “Support Digestion Naturally: Salivarex”.
34
“Alkadophilus: The Non-Refrigerated Acidophilus”, also here.
35
R.J. Payne, D. Phil, V.A. Jansen, “Phage therapy: the peculiar kinetics of self-replicating pharmaceuticals” Clin. Pharmacol. Ther. 68(September 2000):225–230.
36
Michael G. Kaplitt, Arthur D. Loewy, eds., Viral Vectors: Gene Therapy and Neuroscience Applications, Academic Press, New York, 1995. Angel Cid-Arregui, Alejandro Garcia-Carranca, eds., Viral Vectors: Basic Science and Gene Therapy, Eaton Publishing Co., 2000. David Latchman, Viral Vectors for Treating Diseases of the Nervous System, Academic Press, New York, 2003. Curtis A. MacHida, Jules G. Constant, eds., Viral Vectors for Gene Therapy: Methods and Protocols, Humana Press, 2002.
37
Jonathan King, “Chapter 9. The biotechnology revolution: self-replicating factories and the ownership of life forms,” in Jim Davis, Thomas A. Hirschl, Michael Stack, eds., Cutting Edge: Technology, Information Capitalism and Social Revolution, Verso Books, 1997. M. Kleerebezemab, P. Hols, J. Hugenholtz, “Lactic acid bacteria as a cell factory: rerouting of carbon metabolism in Lactococcus lactis by metabolic engineering”, Enzyme Microb. Technol. 26(1 June 2000):840–848. J. Hugenholtz, M. Kleerebezem, M. Starrenburg, J. Delcour, W. de Vos, P. Hols, “Lactococcus lactis as a cell factory for high-level diacetyl production”, Appl. Environ. Microbiol. 66 (September 2000):4112–4114. Bernard R. Glick, Jack J. Pasternak, Molecular Biotechnology: Principles and Applications of Recombinant DNA, American Society for Microbiology, Washington, DC, 2003.
38
According to Press [44]: “The first patented form of life produced by genetic engineering was a greatly enhanced oil-eating microbe. The patent [45] was registered to Dr. Ananda Chakrabarty of the General Electric Company in 1981 and was initially welcomed as an answer to the world’s petroleum pollution problem. But anxieties about releasing ‘mutant bacteria’ soon led the U.S. Congress and the Environmental Protection Agency (EPA) to prohibit the use of genetically engineered microbes outside of sealed laboratories.
 
The prohibition set back bioremediation for a few years, until scientists developed improved forms of oil-eating bacteria without using genetic engineering. After large-scale field tests in 1988, the EPA reported that bioremediation eliminated both soil and water-borne oil contamination at about one-fifth the cost of previous methods. Since then, bioremediation has been increasingly used to clean up oil pollution on government sites across the United States.”
39
P. Kotrba, L. Doleckova, V. de Lorenzo, T. Ruml, “Enhanced bioaccumulation of heavy metal ions by bacterial cells due to surface display of short metal binding peptides”, Appl. Environ. Microbiol. 65(March 1999):1092–1098; W. Bae, R.K. Mehra, A. Mulchandani, W. Chen, “Genetic engineering of Escherichia coli for enhanced uptake and bioaccumulation of mercury”, Appl. Environ. Microbiol. 67(November 2001):5335–5338; X. Deng, Q.B. Li, Y.H. Lu, D.H. Sun, Y.L. Huang, X.R. Chen, “Bioaccumulation of nickel from aqueous solutions by genetically engineered Escherichia coli”, Water Res. 37(May 2003):2505–2511.
40
I. Suzuki, “Microbial leaching of metals from sulfide minerals”, Biotechnol. Adv. 19(1 April 2001):119–132. D.V. Rao, C.T. Shivannavar, S.M. Gaddad, “Bioleaching of copper from chalcopyrite ore by fungi”, Indian J. Exp. Biol. 40(March 2002):319–324. D.E. Rawlings, D. Dew, C. du Plessis, “Biomineralization of metal-containing ores and concentrates”, Trends Biotechnol. 21(January 2003):38–44. G.J. Olson, J.A. Brierley, C.L. Brierley, “Bioleaching review part B: progress in bioleaching: applications of microbial processes by the minerals industries”, Appl. Microbiol. Biotechnol. 63(December 2003):249–257.
41
Robert A. Freitas Jr., Ralph C. Merkle, Kinematic Self-Replicating Machines, Landes Bioscience, Georgetown TX, 2004, Section 4.11.3.3.
42
Robert A. Freitas Jr., Ralph C. Merkle, Kinematic Self-Replicating Machines, Landes Bioscience, Georgetown TX, 2004, Section 4.3.7.
43
Clive Thompson, “The Virus Underground”, The New York Times, 8 February 2004.
44
Joseph Henry Press, “Chapter 5. Biotechnology and the Environment,” Biotechnology Unzipped: Promises and Realities, National Academy of Sciences, Washington, DC, 2003, pp. 134–160.
45
Ananda M. Chakrabarty, “Microorganisms having multiple compatible degradative energy-generating plasmids and preparation thereof”, United States Patent No. 4,259,444, 31 March 1981; Ananda M. Chakrabarty, Scott T. Kellogg, “Bacteria capable of dissimilation of environmentally persistent chemical compounds”, United States Patent No. 4,535,061, 13 August 1985.