Tens for the Tens: Introduction
Dec. 23rd, 2009 11:56 am

Over the next few days, I will be presenting my Tens for the Tens - ten cultural/economic and ten technological/scientific trends or factors (defined broadly) that I think will be important in the 2010s. But in this post, I want to discuss certain topics that, for reasons I will explain, I won't be discussing directly later.
Molecular Nanotechnology
We have an existence proof for the self-replicating nanomachine - the biological cell. We have an existence proof for the nanofactory - a device that can produce macroscopic objects out of dirt, water, air and sunlight over a period of hours to years (a tree, for instance). Clearly MNT is a hard problem. I do think that in ten years we will be able to point to systems and say that they are unambiguously MNT, and, as Charlie has recently pointed out (citing K. Eric Drexler), more MNT-related work is going on than you might think, but I don't think we'll have nanomachines, much less personal nanofactories, by 2020 (not least because PNFs are WMDs). I do still think that my original timetable for MNT (and AGI) holds good: the decade 2005-2015 would be the decade of research and 2015-2025 the decade of innovation, when we will see the first MNT (AGI) products come to market. I think it will be the 2025-2035 period when things become potentially interesting.
Artificial General Intelligence
As I.J. Good said, "the first ultraintelligent machine is the last invention that man need ever make." There are now quite a number of active AGI groups proposing different approaches to the problem, and there does seem to be an increase in interest from mainstream researchers and funding bodies. Given that sheep-level AGI would have many interesting applications, let alone cat/dog-level AGI or chimp-level AGI (and that leaves aside crow-level AGI, octopus-level AGI, ant/bee-level AGI and (weird) cybernetic AGI), I do think we will see significant investment in this area over the next decade. With MNT, one feels that it is such a hard problem that it will require hundreds of research groups, each with dozens of members, working for many years to make much in the way of progress, but with AGI there is always the niggling feeling that it could be cracked by a couple of people in a garage. Or, more likely, twenty people at JCB. If we do see human-level AGI, we will proceed directly to the Singularity, we will not pass Go, we will not collect £200 and all bets will be off. (There is an argument that it would cost a great deal to train a human-level AGI. Perhaps. But among the first things that you would want the AGI to be capable of would be recursive self-improvement, at both a hardware and a software level. Although we can imagine AGIs that are as befuddled in most matters as most humans, I think it is unnecessarily mysterian to assume that that is the only possible kind of AGI ("Man, we won't even be able to talk to those crazy AGIs, and they're gonna be so slooowww..."), and they wouldn't necessarily stay like that for very long. We could have a bizarre stage of "child" AGI for a few years before Moore's law and recursive self-improvement kick in. Now, that would be an interesting time.)
Transhumanism
As Julian Huxley said, "'I believe in transhumanism': once there are enough people who can truly say that, the human species will be on the threshold of a new kind of existence, as different from ours as ours is from that of Pekin man. It will at last be consciously fulfilling its real destiny." I think it is difficult to argue with those sentiments. OK, people do argue with them, but I think they are wrong.
I rather like the H+ symbol. It has a pleasantly retro (not steampunk) feel to it. It reminds me of the GE logo. It would, I think, work well as a lapel badge, although perhaps that is too close to an Ayn Rand dollar lapel badge - or an American flag. I think the 10s could be the decade in which transhumanism moves into the mainstream, much as feminism emerged in the 1960s. Transhumanism and singularitarianism are, and have been for millennia, deeply embedded in the human project, and Francis Fukuyama described transhumanism as the "World's Most Dangerous Idea" in the pages of Foreign Policy in 2004. These ideas aren't going to go away. Which is the point. Transhumanism is not a single idea; it is a label applied to a vast complex of sometimes contradictory ideas. If we want transhumanism to become more mainstream (i.e. to have more influence on policy and the polity, and for there to be more funding for transhumanist projects), we might want to find a different name, or at least a way of deemphasising the "self-indulgent, uncontrolled power-fantasies" of transhumanism. It really is about a lot more than that.
The Technological Singularity
As Ludwig Wittgenstein said, "What we cannot speak about we must pass over in silence." The nature of accelerating change is that things change slowly until suddenly they change very quickly (my mum has just bought a second hard disk recorder, so the telly in the dining room now has one). The Singularity might be closer than we think. And perhaps this will be the decade in which the idea of the Singularity enters public consciousness. Belle de Jour is already hip to it. Ten years from now, the Singularity might be seen as the number one issue facing humanity. Perhaps. I don't think there will be a cabinet-level Department for the Singularity, but there might be an Institute for Singularity Science at Imperial College. I still think 2035-2045 is a more likely timeframe (perhaps I would, wouldn't I?), but I am prepared to be surprised.
Black Swans
The dotcom crash, 9/11 and the Credit Crunch were all about Level 6 Black Swans. I do think there will be a Level 8 (WWI-level) Black Swan in the 10s. I just don't know what it will be. Those of us who have lived in the West over the last 60 years have lived in an exceptional time and place. It has not always been like this, and it is not like this for the vast majority of people on the planet. And consider that middle-class white boys were conscripted and shipped off to Vietnam by the US and Australian governments in the 1960s and 1970s, so it did happen here. There could be something nasty lurking a few years out in the future. Consider 1914 (from 1909).
A few months ago, the Weasel asked me for a list of Black Swans and these were the ones I came up with off the top of my head in a few minutes, but I am sure there are (many) others.
1. Major terrorist attack. One or two orders of magnitude bigger than 9/11. Possibly a nuclear attack on London or New York using a Pakistani nuke, or possibly a Russian Red Mercury bomb. What happens if hafnium bombs are feasible? This would be followed by an invasion of Pakistan and the reintroduction of conscription to fight an endless war in the mountains of central Asia. Possibly also a major recession/depression associated with the attack, as after 9/11, and extreme civil liberties restrictions. So basically a Super 9/11.
2. Massive climate-change-induced crop failures. Mass famines in Africa and Asia. Huge food price rises. Food riots in the West. Depression.
3. Nuclear strike. Indo-Pak, Iran-Israel, North Korea-South Korea. Leads to all sorts of kerfuffle and an endless war. Depression.
4. Double dip depression. China implodes under the weight of its internal contradictions. Taiwan declares independence. Rump of China attempts to invade Taiwan. War with the West. Goes nuclear (at least tactically so). Endless war. Depression.
5. Wars with or between various countries. Russia. India-China. Indonesia-Australia. When I was at Siemens, one of my colleagues was ex-New Zealand Army. I asked him what scenarios they had trained for during the Cold War, expecting him to say defending New Zealand air force and naval bases against Spetsnaz attack during WWIII. He said: helping the Australians when the Indonesians invade. Population and environmental pressures could lead to large-scale population movements. Australia is large and empty and could be very tempting to a beleaguered Indonesian regime. Wars can be driven by combinations of the usual population-environmental-energy-food-terrorism factors. Depression.
6. North Atlantic Drift turns off. European society grinds to a halt. Depression.
7. Cat 6 Hurricane. Direct hit on Miami. Millions dead. Followed days later by a Tokyo earthquake. The insurance system is the foundation of the capitalist system - far more important than the banks. The global insurance system implodes. Depression. War.
8. Tunguska-type event leads to a (full-scale) thermonuclear exchange.
9. Some disruptive technology having unexpected consequences. Perhaps quantum computers, or some kind of cat/chimp-level AGI, or nanotech disrupting the established raw materials-manufacturing-retail cycle. And remember that a Personal Nanofactory is a WMD.
10. Singularity (soft or hard) (I don't expect this before, say, 2035).