I was chatting up a Washington liberal today, and it was depressing.
The subject was computing. The liberal bemoaned the power of corporations to wreck a great, highly-functional government project.
The project was starved for funds, its developers allowed to leave, and now its bones were being picked by lobbyists, all pitching their "best of breed" systems as replacements for bits and pieces of what had once been a magnificent computing edifice.
Even if Democrats are elected this fall, he said, they don't understand these technical arguments about open source vs. proprietary. They'll be bought off just like the current crop.
Which is when it hit me: the frame he could use to tear down all those vendors and bring back what was lost, or what is in the process of being lost.
Open source is parallel processing. (Shown is the parallel processing lab at the University of Utah.)
No matter how big a vendor might be, it's still one system. Like the Von Neumann architectures that dominated computing for its first 40 years, it has a bottleneck. The only way to speed up the search for a solution is to speed up the whole machine -- get more GHz. It's this kind of thinking that led, by the 1980s, to so-called "supercomputers" like the Cray.
Massively parallel processing was pioneered in the 1980s at Sandia National Laboratories in New Mexico. The idea was simple -- break a job into parts, move the parts onto many systems, then put the solutions back together on the back end.
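To make that scatter-gather idea concrete, here is a minimal Python sketch: split a job into chunks, farm the chunks out to worker processes, then combine the partial answers at the end. The word-counting task, the chunk sizes, and the function names are all illustrative, not taken from any particular system.

```python
# A minimal sketch of scatter-gather parallelism: split a job into parts,
# hand the parts to separate worker processes, then combine the results.
from multiprocessing import Pool

def count_words(chunk):
    # Each worker solves its piece of the problem independently.
    return sum(len(line.split()) for line in chunk)

def parallel_word_count(lines, workers=4):
    # Scatter: break the job into roughly equal parts.
    size = max(1, len(lines) // workers)
    chunks = [lines[i:i + size] for i in range(0, len(lines), size)]
    # Run each part on a separate worker process, then gather the answers.
    with Pool(workers) as pool:
        partials = pool.map(count_words, chunks)
    # Combine: put the solutions together on the back end.
    return sum(partials)

if __name__ == "__main__":
    sample = ["the quick brown fox", "jumps over", "the lazy dog"] * 1000
    print(parallel_word_count(sample))
```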
In the 20 years since, parallel processing has come to dominate computing, relegating Von Neumann to a Wikipedia entry. First people stacked Macs to beat a Cray. Then they applied the idea to the Internet itself, creating distributed computing projects like SETI@home. Today parallel processing happens inside the chips themselves -- the latest AMD and Intel silicon does its work in parallel. From two cores to four to eight -- who knows how far we can go with it.
That's sort of how open source works. Only on steroids.
Because with open source you not only parcel out pieces of a project to different companies, or different developers, but their work can cross-pollinate. Not only can you build systems in parallel, but you can also use a vast community of users to find bugs, and another vast army to stamp them out.
The genius of Linus Torvalds lies in his ability to constantly re-engineer Linux's development process, first farming out all the work, then finding new ways to coordinate the massively parallel architecture that develops in response. And the design of Linux itself responds well to this parallel processing impulse, since it consists of central functions in a kernel, ancillary functions surrounding it, and a host of distribution providers who can build working systems from all the pieces -- sometimes using just parts of the kernel for a mobile system, sometimes embracing optional pieces like virtualization for a server.