Nanotechnology: where it stands today

by K. Eric Drexler

The following essay is adapted from the new Afterword to Engines of Creation (Doubleday, 1986), written for the British edition to be published in 1990 by Fourth Estate.

What would I correct in Engines today, after several years of discussion, criticism, and technological progress? The first dozen pages would report recent advances in technology (discussed below), but the conclusion would remain the same: we are moving toward assemblers, toward an era of molecular manufacturing giving thorough and inexpensive control of the structure of matter. There would be no changes in the central theses, because they seem solid.

Advancing technologies

Technological progress has, on the whole, been faster than I had expected. Engines speculates about when we might reach the milestone of designing a protein molecule from scratch, but avoids making any rash prediction of a date. In fact, this was accomplished in 1988 by William F. DeGrado of Du Pont and his colleagues.[1] They designed a small protein, α4, which is substantially more stable than natural proteins of comparable size. This success, which precedes any general way of predicting what structure a natural protein chain will assume, and the unusual stability of the product, confirm predictions which I had published in 1981 regarding the feasibility and capabilities of protein engineering.[2]

Protein Engineering, now the title of a journal, has become a major academic and industrial enterprise. Groups around the world are making small modifications to natural proteins, swapping sections of natural proteins to build new structures, and building new designs starting with a clean slate.

A related area of research has been underway for years and is picking up speed: the design and synthesis of smaller molecules having protein-like capabilities, but non-protein-like structures.
These molecules bind other molecules, building up larger structures; some can serve as enzyme-like catalysts. In 1987, a Nobel prize for pioneering research in this area (commonly referred to as "molecular recognition") was shared by Charles Pedersen of Du Pont, Donald Cram of UCLA, and Jean-Marie Lehn of the Université Louis Pasteur.[3]

Building protein-like chains from molecular structures capable of "recognition" is a promising technique for use in engineering molecular systems. Such a chain could fold (or bind a neighbor) in a manner predetermined by the properties of these building blocks, simplifying the problems faced by the designer. Building molecules that resemble proteins, save for differences chosen to aid design, is an attractive strategy. At the Universität Basel in Switzerland, Manfred Mutter has had striking successes with protein-like molecules based on branched--rather than linear--polymer chains. The potential of such pseudo-proteins has only begun to be explored: to biochemists they may seem uninteresting because they differ from nature; to chemists they may seem uninteresting because they attempt to avoid fascinating (i.e., difficult) problems of structure and synthesis. But to molecular technologists trying to build ambitious, functional systems (rather than academically impressive components), they have real appeal.

Computer-based tools for modeling molecules have improved rapidly.[4] Describing the behavior of a large molecule requires calculations which gobble computer time, but the computer power available for a given price has grown exponentially over the years. Still, accurate quantum-mechanical calculations are possible only for very small molecules, because the amount of computer time needed for such calculations grows sharply as molecular size increases. Calculations that treat molecules as objects with size, shape, and moving parts are practical on a large scale, however.
With these approximations, it is now routine to calculate the behavior of protein molecules, following the motions of thousands of atoms.

Design tools, too, have improved. Jay Ponder and Frederic Richards of Yale University have developed a program that can determine which sequences of amino acids will be able to form a stable, tightly packed protein core. Tom Blundell and his colleagues at the University of London have developed a program, COMPOSER, which aims to find parts of known protein structures that will fit together to form new molecules. These design tools, combined with ever-better strategies for design and synthesis, promise ever more rapid advances in engineering molecular objects of kinds which can serve as tools and components on the path to nanotechnology.

In a note, Engines mentions the scanning tunneling microscope (STM), and suggests that it "may be able to replace molecular machinery in positioning molecular tools." The STM is a device which can map the bumps and hollows of a conducting surface--often in atomic detail--by scanning a sharp probe above the surface, just close enough for electrons to jump the gap at an appreciable rate. It has since been joined by a relative, the atomic force microscope (AFM), which senses the force between a probe and a surface rather than sensing an electrical current.

At IBM's Almaden research center, John Foster's group has observed and modified individual molecules using the technology of the scanning tunneling microscope.[5] A voltage pulse can pin a molecule to a graphite surface, forming a chemical bond; further pulses can fragment or remove the pinned molecule, all visible in the STM images. (The possibility of building a computer memory device immediately suggests itself.)
A group at AT&T Bell Labs has deposited what are thought to be single atoms of germanium onto a germanium surface, again using voltage pulses applied to an STM tip; the process does not work with silicon, despite its similarities to germanium.[6] These processes are, however, relatively uncontrolled in a molecular sense: neither can reliably produce a particular chemical change at a particular location, as an assembler must. Nonetheless, John Foster has expressed strong interest in pursuing my suggestion of using specially engineered molecular tools on probe tips. The practical consequences of using such molecules as tools for chemical synthesis are unclear, since the resulting device could build only one molecule at a time, but work in this direction could lead to a crude protoassembler in the not-too-distant future. Sufficiently accurate positioning mechanisms exist in the AFM and STM; the technology needed to engineer molecules that bind to other molecules is maturing, and could likely be adapted to engineer molecules that bind to probe tips. Putting these developments together into a working system, though a challenging objective, seems entirely feasible.

In short, advances toward nanotechnology through molecular systems engineering have been more rapid than Engines might suggest. This makes understanding and preparation that much more urgent.

The spread of ideas

The idea of nanotechnology has spread far, both through Engines itself (Japanese and British editions are planned for 1990) and through other publications. A recent summary appears in the 1990 Britannica yearbook, Science and the Future.[7] The Foresight Institute has provided a publication medium for news and discussion on nanotechnology and (to a lesser extent) for news regarding developments at the frontiers of software technology. Interest in the U.S. may be gauged, in part, by the demand for talks on the subject.
I have been invited to speak at most of the top technical universities and many of the top corporate research laboratories in the U.S. At Stanford, when I taught the first university course on nanotechnology, the room and hallway were packed with students on the first day, and the last to enter climbed through a window. Interest has been strong and growing.

Mutant memes

As information spreads, so does misinformation. Thus far, fragmentation has not been a great problem, because the core ideas in Engines have traveled more or less as a package. In particular, the idea of nanotechnology as powerful, potentially dangerous, potentially beneficial, and effectively inevitable has remained intact, even in most presentations in the media. A more common problem has been a loss of distinctions; this has been especially visible in press coverage. Since there has now been considerable experience with how these ideas are distorted, perhaps distortion can be minimized by pointing out the patterns.

Minds lacking suitable education seem to have only one mental pigeon-hole for "invisibly small"; hence distinctions of scale among invisibly small objects tend to collapse. Let's see, are cells smaller than molecules, or vice versa? Are nanocomputers the size of atoms? Of molecules? Of cells? These matters have often led to confusion, though the differences in scale among these objects are enormous. From the 30 micron scale of a fairly typical human cell to the 0.3 nanometer scale of a typical atom is a factor of 100,000 in linear dimension, or a factor of 1,000,000,000,000,000 in volume. This is the difference between a mountain and a marble. Atoms make up molecules, which make up cells, with nanocomputers far larger than typical molecules, and yet far smaller than typical cells.
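The scale arithmetic above can be checked in a few lines. (A quick sketch; the mountain and marble sizes used here, roughly 1 km and 1 cm, are my own illustrative figures, not the essay's.)

```python
# Check the cell-to-atom scale comparison from the text.
cell = 30e-6   # 30 microns: a fairly typical human cell, in meters
atom = 0.3e-9  # 0.3 nanometers: a typical atom, in meters

linear_ratio = cell / atom        # factor in linear dimension
volume_ratio = linear_ratio ** 3  # factor in volume

print(f"linear: {linear_ratio:,.0f}")  # 100,000
print(f"volume: {volume_ratio:,.0f}")  # 1,000,000,000,000,000

# The same linear factor separates a ~1 cm marble from a ~1 km mountain.
mountain, marble = 1e3, 1e-2  # meters (illustrative sizes)
print(f"mountain/marble: {mountain / marble:,.0f}")  # 100,000
```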
In a similar vein, microtechnology is often confused with nanotechnology, despite the 1,000,000,000-fold difference in the volume of a typical part, and despite the radical difference between a technology which miniaturizes bulk processes and one which guides a series of chemical reactions to build objects with molecular precision. Microtechnologists in the U.S. have been justifiably upset by media coverage which first describes their crude, hard-won micromotors and then says, in effect, "But this is nothing compared with nanotechnology." Microtechnology is practice; nanotechnology is still theory. They are hardly comparable. Further, if (as often happens) nanotechnology is portrayed as an outgrowth of microtechnology, it seems fanciful, much as if someone asserted that miniaturizing bulldozers would let us build fine watches out of dirt.

With a bit more justification, some say that nanotechnology is "just chemistry," or describe early protein engineering as nanotechnology. Here, at least, there is a continuum of technologies; chemistry is indeed moving toward nanotechnology. Nonetheless, there is a huge gap between present practice and the sort of nanotechnology described in Engines. The difference between, say, an enzyme and an assembler-based manufacturing system is as large as the difference between a transistor and a computer. If one wants to say that nanotechnology is just chemistry, this is true in the same sense that computer engineering is just solid-state physics and circuit theory.

Strategies for making small systems divide into top-down and bottom-up approaches. In top-down approaches, the challenge is to make devices smaller and smaller, and the atomic graininess of matter looms as a growing problem. In bottom-up strategies, the atomic graininess of matter is fundamental, and the challenge is to build larger and larger objects while retaining full control of structure.
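The 1,000,000,000-fold part-volume figure cited above follows from a roughly 1,000-fold difference in linear scale. (A quick check; the specific part sizes, micron-scale versus nanometer-scale, are illustrative assumptions rather than figures from the essay.)

```python
# A typical micromachined part is on the micron scale; a molecular
# part is on the nanometer scale (illustrative sizes).
micro_part = 1e-6  # 1 micron, in meters
nano_part = 1e-9   # 1 nanometer, in meters

linear = micro_part / nano_part  # 1,000-fold in linear dimension
volume = linear ** 3             # 1,000,000,000-fold in volume
print(f"{linear:,.0f}x linear -> {volume:,.0f}x volume")
```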
Chemistry works bottom up; strategies based on STM or AFM positioning mechanisms will likewise work bottom up. These strategies thus blur into nanotechnology as conventional microtechnology does not. If the word "nanotechnology" is captured to describe some modest extension of microtechnology or chemistry, then we will need a new term to describe the technological developments central to the discussion in Engines. "Molecular manufacturing" might be a good, descriptive choice.

In discussing nanotechnology proper, writers have tended to collapse distinctions among different kinds of nanomachines. Engines discusses assemblers, nanocomputers, replicators, cell repair machines, and the prospect of achieving genuine artificial intelligence with the aid of massively expanded computational resources. How's that again? Ah, yes, nanotechnology will be based on self-replicating, artificially intelligent molecular machines, right?

Wrong, obviously. Not all molecular machines will include assemblers, for the same reason that not all macroscopic machines include stereophonic sound systems. Likewise, replicators will be a special class of device, and only a highly skilled and hardworking fool would build replication abilities into every piece of machinery. Even long after nanotechnology matures, genuine artificial intelligence may not be found in anything whatsoever; it surely will not be found in the microscopic equivalent of today's microprocessors! And if the above collapse of distinctions seems too absurd to mention, I should note that it was represented as my view of nanotechnology by a social scientist who subsequently made the first presentation on the subject to the U.S. National Science Foundation. As one would expect, it and its mutant relatives also turn up in the popular press.

In light of the above confusions, it is no surprise that people collapse the distinction between nanomachines and living cells.
After all, both are small and contain molecular machinery, and they don't even differ in size by a factor of a billion. Nonetheless, confusing a bacterium with an Engines-style nanomachine is like confusing a rat with a radio-controlled model car. One is an evolved, flexible organic system; the other is a designed, inflexible machine. One forages in nature; the other requires special fuel. The differences in style, organization, and abilities run deep. We may someday learn to make nanomachines that are biological in style and flexibility without closely copying nature, but this would require a special research program. The problems involved are far beyond, and quite different from, those that will arise in making the computers, robotic arms, and the rest of conventional nanotechnology.

Some imagine that small machines will be magical and omnicompetent--flexible, evolving, intelligent, and the rest. But making things small will not automatically make them wondrous in all respects. Small machines will be machines. Making one nanomachine do something useful will take hard work. Making another do something else will take yet more work.

Selection pressures

The idea of nanotechnology has now been in wide circulation for several years. What has been the reaction of the technical community--of those best able to find and label erroneous ideas? There has been a background level of pooh-poohing, much of which no doubt derives from media-based misimpressions. The interesting question, however, is not what educated people may say in an off-hand remark over lunch, but what they say when they grapple with the ideas. From where I stand (e.g., in front of questioning technical audiences after giving a technical talk), the central theses of Engines look solid: they have withstood criticism. This is not to say that everyone accepts them, but merely that whenever someone has suggested a reason for rejecting them, that reason has turned out to be faulty.
(My apologies to hidden critics with novel, substantive points--please step out and speak up!) Engines typically argues by example and by the engineering principle of composition (e.g., since molecular machines do exist, they can exist, and since simple machines can be composed to make complex machines, complex molecular machines can likewise exist). Such propositions are robust and hard to refute, but they lack the specificity and mathematical analysis commonly found in technical argumentation. As a result, some leaders of the scientific and technological community have found these arguments convincing, while others still imagine that no argument, much less a convincing argument, has even been made. For those in search of more technical and mathematical detail, a variety of technical papers (on mechanical nanocomputers, molecular gears and bearings, etc.) are available, and a technical book is on the way.[8]

After a series of local meetings, the Foresight Institute sponsored the first major conference on nanotechnology in October 1989 (covered in the 4 November Science News); a proceedings volume is in preparation. This conference gave participants a chance to hear presentations from leaders in protein engineering, self-assembling molecular systems, quantum electronics, scanning tunneling microscopy, and other relevant fields, all in the context of nanotechnology, its development, and its consequences. The result was a far better picture of an issue that Engines left in soft focus: how nanotechnology will actually be developed.

At the conference, it also became clear that Japan has for several years been treating molecular systems engineering as a key to 21st century technology. If the rest of the world wishes to see cooperative development of nanotechnology, it had best wake up and start doing its part. It might also be wise not to escalate an international war of words, tariffs (and more?)
over such trivia of late 20th century industry as cars, VCRs, chips, and whatnot. In the late 19th century, guano deposits were of enormous economic and strategic value, but this value evaporated in the early 20th century with the advent of the Haber process for nitrogen fixation. Let us not disrupt productive alliances over the modern equivalent of guano.

Afterthoughts and further information

Certain scenarios and proposals in the last third of Engines could bear rephrasing, but at least one problem is presented misleadingly. Page 173 speaks of the necessity of avoiding runaway accidents with replicating assemblers; today I would emphasize that there is little incentive to build a replicator even resembling one that can survive in nature. Consider cars: to work, they require gasoline, oil, brake fluid, and so forth. No mere accident could enable a car to forage in the wild and refuel from tree sap: this would demand engineering genius and hard work. Likewise with simple replicators designed to work in vats of assembler-fluid, making non-replicating products for use outside. Replicators built in accord with suitable regulations would not even resemble anything that could run wild. The problem--and it is enormous--is one not of accidents, but of abuse.

Some have mistakenly imagined that my aim is to promote nanotechnology; it is, instead, to promote understanding of nanotechnology and its consequences, which is another matter entirely. Nonetheless, I am now persuaded that the sooner we start serious development efforts, the longer we will have for serious public debate. Why? Because serious debate will start with those serious efforts, and the sooner we start, the poorer our technology base will be. An early start will thus mean slower progress and hence more time to consider the consequences.

* * * * * * *

K. Eric Drexler is a researcher concerned with emerging technologies and their consequences for the future. He serves as president of the Foresight Institute.
Notes and References

1. A good review of this and related work may be found in William F. DeGrado, Zelda R. Wasserman, and James D. Lear, Protein Design, a Minimalist Approach, Science Vol. 243, pp. 622-628 (1989).

2. K. Eric Drexler, Molecular engineering: An approach to the development of general capabilities for molecular manipulation, Proceedings of the National Academy of Sciences (USA) Vol. 78, pp. 5275-5278 (1981).

3. Of particular interest are the Nobel lectures of the two currently active researchers. See Jean-Marie Lehn, Supramolecular Chemistry--Scope and Perspectives: Molecules, Supermolecules, and Molecular Devices, Angewandte Chemie International Edition in English Vol. 27, pp. 89-112 (1988), and Donald J. Cram, The Design of Molecular Hosts, Guests, and Their Complexes, Science Vol. 240, pp. 760-767 (1988).

4. Software useful for computer-aided design of protein molecules has been described by C. O. Pabo and E. G. Suchanek in Computer-Aided Model-Building Strategies for Protein Design, Biochemistry Vol. 25, pp. 5987-5991 (1986), and by Tom Blundell et al., Knowledge-based protein modelling and design, European Journal of Biochemistry Vol. 172, pp. 513-520 (1988); a program which reportedly yields excellent results in designing hydrophobic side-chain packings for protein core regions is described by Jay W. Ponder and Frederic M. Richards in Tertiary Templates for Proteins, Journal of Molecular Biology Vol. 193, pp. 775-791 (1987). The latter authors have also done work in molecular modeling (an enormous and active field); see An Efficient Newton-like Method for Molecular Mechanics Energy Minimization of Large Molecules, Journal of Computational Chemistry Vol. 8, pp. 1016-1024 (1987). Computational techniques derived from molecular mechanics have been used to model quantum effects on molecular motion (as distinct from quantum-mechanical modeling of electrons and bonds); see Chong Zheng et al., Quantum simulation of ferrocytochrome c, Nature Vol. 334, pp. 726-728 (1988).
5. J. S. Foster, J. E. Frommer, and P. C. Arnett, Molecular manipulation using a tunnelling microscope, Nature Vol. 331, pp. 324-326 (1988).

6. R. S. Becker, J. A. Golovchenko, and B. S. Swartzentruber, Atomic-scale surface modifications using a tunnelling microscope, Nature Vol. 325, pp. 419-421 (1987).

7. K. Eric Drexler, Machines of Inner Space, in 1990 Yearbook of Science and the Future. Edited by D. Calhoun. pp. 160-177. (Chicago: Encyclopaedia Britannica, 1989).

8. The following papers will be collected and rewritten as parts of my forthcoming technical book: Nanomachinery: Atomically precise gears and bearings, in the proceedings of the IEEE Micro Robots and Teleoperators Workshop (Hyannis, Massachusetts: IEEE, 1987); Exploring Future Technologies, in The Reality Club. Edited by J. Brockman. pp. 129-150. (New York: Lynx Books, 1988); and Rod Logic and Thermal Noise in the Mechanical Nanocomputer, in Molecular Electronic Devices III. Edited by F. L. Carter and H. Wohltjen. (Amsterdam: Elsevier Science Publishers B. V., in press).

For information on the availability of technical papers, please contact the Foresight Institute.