Sometimes things that seem to have no connection turn out, once you start to peel back the onion, to be far more correlated than meets the eye. One of these is the Top 500 supercomputer list. My mainstream employment happens to be in the technology field, so it makes some sense to take a peek at this list; second, HPC (High Performance Computing) has always been a passion of mine, so it was a double incentive to at least passively watch the players come and go. For those normal folk with only a little “geek-dom”: this is a roster of the 500 most powerful computer installations in the world, overseen by University of Tennessee professor Jack Dongarra, who has the task of managing the list.
So outside of general bragging rights, what makes this list so important, you might ask? Well, the first thing is that HPC installations have only one purpose: to solve very complex problems. These are not the systems your bank would use to keep track of your checking account, nor the ones hosting your favorite streaming site. They exist solely to tackle the “complex”: model global weather patterns, find oil in petabytes of seismic data, unravel the human genome, and so on. These are the modern-day industrial engines of nations, allowing us to understand and achieve things simply not possible or practical in the physical world.
With this said, it's then important to see which countries sit in the Top 10, and as of June 2011 the United States holds the most spots with five, followed by a tie between Japan and China, with France bringing up the rear. So while this bodes well for the US, the other piece is the coveted top spot, currently held by Japan, and it looks like they will hold that position for a full year; that is far longer than most, even measured in the dog years by which we count computer generations.
However, it's not so much the fact that Japan has held this position for so long (rumor has it IBM and Cray are building 20-petaflop machines for the Department of Energy, due to go live next year). It's that we appear to be hitting a sigmoidal curve in the growth of computing, on two fronts. The first is power: the Japanese behemoth consumed 9.89 megawatts while reaching a peak of 8.162 petaflops. That is enough electricity to power almost 10,000 US homes, with an electric bill of nearly 10 million dollars a year. As you can see, this is one of the main reasons the construction of these beasts is so closely tied to nations; even large corporations lack the checkbook to feed these monsters on a steady basis.
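Those two figures, almost 10,000 homes and nearly 10 million dollars, check out on the back of an envelope. A minimal sketch, assuming an average US household draw of roughly 1 kW and an electricity rate of about $0.11/kWh (both assumptions of mine, not figures from the Top 500 list):

```python
# Back-of-the-envelope check of the 9.89 MW power claim.
# Assumed figures: average US home draws ~1.0 kW continuously,
# and electricity costs roughly $0.11 per kWh.
power_mw = 9.89                                  # reported peak draw
hours_per_year = 24 * 365

energy_kwh = power_mw * 1000 * hours_per_year    # kWh burned per year

home_avg_kw = 1.0                                # assumed household draw
homes_powered = power_mw * 1000 / home_avg_kw    # ~9,890 homes

price_per_kwh = 0.11                             # assumed rate, USD
annual_bill = energy_kwh * price_per_kwh         # ~$9.5 million

print(f"{homes_powered:,.0f} homes, ${annual_bill / 1e6:.1f}M per year")
```

Swap in your own local rate and household average; the order of magnitude does not move much either way.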
So while we have the technology to go higher (computation-wise), meaning we can solve problems of ever greater complexity, power is now a limiting factor. The second front in this mix is software: where the body has the might, the brain lacks the fight. We humans can't seem to keep up with adapting software to the challenges; you can have all the transistors in the world, yet if you lack the ability to tie them all together for a value-added outcome, it's pretty much all for nothing.
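The classic way to quantify that software ceiling, though the article doesn't name it, is Amdahl's law: if only a fraction p of a program can actually run in parallel, the speedup on n processors is 1 / ((1 − p) + p/n), which caps at 1/(1 − p) no matter how many cores you buy. A minimal sketch, with illustrative numbers of my own choosing:

```python
# Amdahl's law: best-case speedup on n processors when a fraction p
# of the work parallelizes. Numbers below are illustrative only.
def speedup(p: float, n: int) -> float:
    """Speedup over a single processor; serial fraction is (1 - p)."""
    return 1.0 / ((1.0 - p) + p / n)

# With 95% of the code parallel, a million cores can never beat 20x:
for n in (100, 10_000, 1_000_000):
    print(f"{n:>9,} cores -> {speedup(0.95, n):5.1f}x")
```

That hard 20x ceiling for a 5% serial fraction is exactly why "all the transistors in the world" don't help until the software side catches up.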
In summary, the 1900s marked an era where industrialism was governed by the locomotive's ability to move raw materials to production facilities. Today, it's the ability of the transistor to solve complex problems, and anything which impedes that also impedes industrialism…