A-Clue.Com will cease in its present form at the end of the year. It will be moving here, becoming a regular weekly feature of the DanaBlankenhorn.Com blog.
To celebrate this change I have five essays on the main topics this blog has covered in its decade of e-mail existence, looking mainly at their present and future. First, last week, was e-commerce, the original beat here. Today I have an essay on Moore's Law. Next week features The World of Always-On, then we move on to Political Cycles, before finishing up with a big Internet Future essay.
Gordon Moore's statement from last year, that Moore's Law is dead, was narrowly drawn and true as far as it went. The fact is, the closer you bring circuit lines together, the bigger the electromagnetic interference when you run electricity through them. Beyond that there is a theoretical limit – when lines get within 1 nanometer of one another, they cease to be separate lines.
But the Moore's Law Process is very much alive. We see it in hard drive capacity, where disk sizes double at the same price every 18 months or so, like clockwork. We see it in optical drives, where every few years brings a dispute among manufacturers over technology that's 10 times better than what came before. We see it in optical cable, where the use of different colors of light lets makers pack ever-more bits down the same pipe. We see it in wireless, where improved digital signal processors enable an 802.11n modem to run 10 times faster than an 802.11b modem, on the same frequency spectrum.
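That 18-month doubling compounds quickly. Here is a minimal sketch of the arithmetic, assuming the essay's 18-month figure (the function name and period are illustrative, not a measured benchmark):

```python
# Compound growth implied by a fixed doubling period.
# A doubling every 18 months means capacity multiplies by
# 2 ** (months_elapsed / 18) at a constant price.
def capacity_multiplier(years: float, doubling_months: float = 18) -> float:
    return 2 ** (years * 12 / doubling_months)

print(capacity_multiplier(3))   # 3 years  -> 4x the capacity per dollar
print(capacity_multiplier(9))   # 9 years  -> 64x the capacity per dollar
```

Three years buys two doublings (4x); nine years buys six doublings (64x), which is roughly the gap between the drives of the late 1990s and those shipping today.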
We even see it at Intel, in the chip sector. The company bit the bullet on its designs two years ago, and now pushes ahead with lower-power chip technologies and multiple processors on a chip. The quad-core chips of today are indeed four times faster than the single-processor chips of three years ago, just as Moore's Law would have predicted.
Moore's Law, as I see it, means that things get faster-and-faster, faster-and-faster. You double 4 to get 8, you double 8 to get 16, and the second doubling gives you twice as much improvement as the first. That's the way numbers work.
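The point about each doubling delivering more absolute improvement than the last can be shown in a few lines (a sketch of the arithmetic above, nothing more):

```python
# Start at 4 and double repeatedly, tracking the absolute gain at each step.
values = [4]
for _ in range(4):
    values.append(values[-1] * 2)

gains = [b - a for a, b in zip(values, values[1:])]
print(values)  # [4, 8, 16, 32, 64]
print(gains)   # [4, 8, 16, 32] -- each step's gain is twice the previous one
```

The gains themselves form the same doubling sequence one step behind, which is why the improvement feels like it accelerates.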
Best of all, these variants of Moore's original law reinforce one another. The optical improvements enable cheaper-and-cheaper backhaul for the wireless links. The improvements in flash memory chips force competition on to the makers of hard drives. The use of parallel processing enables thousands of computers to work together, solving big problems no one computer could solve alone.
The laggard remains software. Some 20 years ago we could make new software that would overtax a new generation's chips, pushing consumers to upgrade and maintaining high PC prices. Microsoft is still trying to play that game, but it's getting increasingly difficult. Because consumers don't really need the bloat Microsoft has inserted into its more recent versions of Windows, PC prices have been on a steady decline during this decade. This takes money out of the technology system.
And that is the biggest problem tech has. Balance sheets are being strained at both ends by Moore's Law today. On the one hand hardware prices, and values, continue declining in line with Moore's basic law. On the other hand the cost of producing each generation of chips goes up, in line with Moore's Second Law. This combination has helped push production offshore – offshore to where people make less, offshore to where environmental regulations are just a theory.
Software's recent price compression has a different cause from Moore's Law. Open source is creating viable competition (at a price of free) in operating systems, databases, and server applications (which use databases). This year saw the enterprise space move rapidly into Linux, for functions like resource planning and customer management, things once done by expensive mainframes. Microsoft is being foiled on the high end, and with the launch of Vista (with its anti-piracy features) it will face real competition on the desktop for the first time.
Already you may, if you choose, run Linux on some old hardware, add Firefox and Open Office, and pretty much conduct business from your home at a retail cost of nearly zero. This Clue is, in fact, being written using Open Office.
This is why the 2000s have become the era of the device. Devices are priced in the hundreds of dollars, not thousands, and can be easily replaced when software changes. Game machines, cell phones, and iPods are all designed, not to wear out, but to be replaced – software and all – within just a few years. Since they cost just a few hundred dollars – the phones can be had for just the cost of a calling plan – the repair business is replaced by the recycling business.
During the 1990s it was thought that ideas such as Virtual Reality, artificial intelligence, and big-screen graphics would continue to soak up computing cycles. But the software wasn't any good. Instead we have spent the decade recapitulating what was done before. Most Linux applications already exist in the Windows or Unix world. Cellphones made small files cool again. Only gaming seems to be pushing the envelope, with online games like Second Life and client-based games like Madden NFL Football calling on people to increase their spending and replace old gear with new.
In the end it all comes down to software. Software, like training, does not respond to Moore's Law. It's true that C or Java is a more efficient programming environment than assembler or COBOL was, but the improvement is not that great. Object-oriented programming, in which large programs are built from simpler modules, is failing to give us the improvements we need. The whole business remains complex, requiring extensive training with no assurance (thanks to outsourcing and imported labor) that high skills will deliver high salaries going forward.
We need breakthroughs. We need breakthroughs in order to direct our computing energy toward the enormous problems facing us: the replacement of hydrocarbons with hydrogen, the absorption of vast new populations into the middle class (which will solve so many problems), and the continuing effort to save life on this Earth and get us beyond it.
Will we get the breakthroughs we need? I believe we will, because I am an optimist. Studying Moore's Law forces optimism on me. It's a naturally optimistic idea.
But there is no certainty here, as there is with Moore's Law.