Who can we blame for all this?
Molecular manufacturing enthusiasts trace the idea to 1959 and Nobel laureate Richard Feynman's famous lecture "There's Plenty of Room at the Bottom". In it, he suggested that it ought to be possible to rearrange atoms "the way we want…all the way down." Far enough down, he said, "all of our devices can be mass produced so that they are absolutely perfect copies of one another." (This idea raises the possibility of hardware-sharing wars far worse than today's copyright battles.) "The principles of physics, as far as I can see, do not speak against the possibility of maneuvering things atom by atom." The problem: our fingers are too big.
The next step didn't come until the mid-1980s, when Eric Drexler published his popular book Engines of Creation. Drexler posited the idea of general-purpose molecular assemblers and also the problem of "grey goo", which, Phoenix says ruefully, "is still haunting the industry". If a really small manufacturing machine escaped and fed off the biosphere, sucking up chemicals it wasn't originally designed for, it could turn the world into an amorphous grey mass. Drexler was thinking in biological terms: bacteria are very inventive, and an invasive species becomes more so if it has no predators. Drexler went on to write the more technical Nanosystems in 1992, though "it was ignored outside of the community".
The word nanotechnology was co-opted to describe nanometer-scale polymer science and other areas. Meanwhile, thinkers like the science fiction writer Vernor Vinge and Ray Kurzweil surmised that humans and artificial intelligence would merge to become something beyond our current comprehension on the other side of a moment Vinge dubbed the Singularity.
By the mid-1990s people were talking about nanomedicine. Still, in 1997, when James Von Ehr, CEO and founder of Zyvex, used some of the $100 million he made from selling a company to Macromedia to found a nanotechnology company, a professor he consulted burst out laughing. But by 2000 nanotechnology, in its less far-out meaning, was becoming mainstream. The US government allocated $1 billion a year for research under the National Nanotechnology Initiative, none of it for molecular manufacturing. (The EU also has a plan.) And then a sort of disaster struck: in 2000 Sun Microsystems' Bill Joy published "Why the Future Doesn't Need Us" in Wired, in which he suggested that molecular manufacturing could destroy the world and should not be invented. Only now, says Phoenix, is the influence of that article fading enough for people to be able to admit again that they're interested in researching this field.
Building utopia, atom by atom
Let's say molecular manufacturing is going to happen. When? And with what consequences? Phoenix thinks we could have nanofactories by 2022, leading, over the next five to seven years, to a brain-machine interface and, given the raw materials, planet-scale engineering. Sooner, he thinks, is better: if development is delayed until after 2025, the related technologies could be so powerful that the whole thing will hit like a tidal wave.
Brian Wang, a futurist and member of the CRN taskforce, has a more detailed set of economic projections as Moore's Law accelerates and extends outside computing and China's economy passes that of the US (which he dates to 2018, plus or minus three years). Wang puts the development of molecular manufacturing at 2015, despite roadblocks in the form of energy (which he thinks will take decades to solve) and conquering space (still hard). But a 1kg nanofactory could, if supplied with enough feedstock and energy, make 4,000 tons of nanofactories and 8,000 tons of products in a single day – making it possible to replace or upgrade more than our current production capability in weeks to months. It would bring with it long-term acceleration of economic growth: wealth for all.
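Wang's figure is a claim about exponential self-replication, and a back-of-envelope check shows it is internally plausible. The one-hour doubling time below is our illustrative assumption, not a number Wang gives; it simply shows that steady doubling over a single day lands in the same order of magnitude as his 12,000-ton total:

```python
# Back-of-envelope sketch: a self-replicating factory that doubles its
# mass at a fixed interval. The one-hour doubling time is an assumption
# chosen for illustration, not a figure from Wang's projections.
initial_kg = 1.0
doublings_per_day = 24  # one doubling per hour, over one day

total_kg = initial_kg * 2 ** doublings_per_day
total_tonnes = total_kg / 1000

print(f"After one day: {total_kg:,.0f} kg ≈ {total_tonnes:,.0f} tonnes")
# 2**24 kg is about 16,777 tonnes -- the same order of magnitude as the
# ~12,000 tons (factories plus products) in Wang's scenario.
```

The point is not the exact doubling time but the shape of the curve: with any doubling time measured in hours, a kilogram-scale starting point reaches industrial-scale output within a day, which is what makes the "weeks to months" replacement claim arithmetically coherent.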