First off, let’s get some perspective on just how big 26 terabits per second really is: right off the bat, that’s enough to stream an amazing 700 DVDs every second! Look at it this way: a standard home “cable modem” connection is only about 2.5 megabits per second on a really, really good day (you share that connection with your neighbors, and it’s first come, first served depending on what they’re doing). At that rate we’re talking about 2,500,000 bits every second. After “mega” (millions) comes “giga” (billions), and it’s not until “tera” that we jump to the trillions, which is a whole lot of zeros.
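For the curious, here is a quick back-of-the-envelope check of those figures. The DVD capacity (about 4.7 GB for a single-layer disc) is my own assumption for illustration; it isn’t stated in the original experiment:

```python
# Back-of-the-envelope check of the 26 Tbit/s figure.
# Assumes a single-layer DVD of about 4.7 GB (illustrative assumption).

link_bps = 26e12                   # 26 terabits per second
dvd_bits = 4.7e9 * 8               # one DVD in bits (~37.6 gigabits)
cable_modem_bps = 2.5e6            # a good day on a shared cable modem

dvds_per_second = link_bps / dvd_bits           # roughly 700 DVDs each second
modem_equivalents = link_bps / cable_modem_bps  # millions of cable-modem lines

print(round(dvds_per_second))    # ~691
print(round(modem_equivalents))  # ~10,400,000
```

In other words, one such link carries roughly the traffic of ten million fully loaded cable modems.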
However, the point isn’t the typical “wow the public with a bunch of zeros,” because what’s impressive about this experiment isn’t the massive amount of data. Instead, it’s how little power was used. This amount of data has been sent over fiber optic cables (the modern-day backbone of the Internet) before, but it required massive amounts of power, and the problem is twofold: wherever there is massive power, massive heat isn’t far behind. It’s this dastardly duo that leads to problems of economics and miniaturization, so thinking in terms of the cost in power (watts) per bit is a better way of looking at the issue.
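To make the watts-per-bit framing concrete, here is a tiny sketch. Both power figures below are purely made-up illustrations (the article doesn’t give the actual wattage); the point is only how the metric behaves:

```python
# Energy cost per bit: power (watts = joules/second) divided by
# rate (bits/second) gives joules per bit.
# The power figures below are illustrative assumptions, not measured values.

def joules_per_bit(power_watts, rate_bps):
    """Energy spent to move one bit across the link."""
    return power_watts / rate_bps

old_link = joules_per_bit(power_watts=1000.0, rate_bps=26e12)  # many lasers
new_link = joules_per_bit(power_watts=100.0, rate_bps=26e12)   # one laser

# Same data rate at one-tenth the power means one-tenth the energy per bit.
print(old_link / new_link)  # 10.0
```

Less energy per bit also means less heat per bit, which is exactly what lets the hardware shrink and the economics work.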
Here the researchers achieved this level of transfer with only one laser, rather than the multiples that were required in the past to switch between frequencies of light (referred to as lambdas). To pull this off, the switching that was previously performed in electronics is now done with light itself, which allows for faster switching, as photons are not prone to all the issues electrons are in the switching world.
However, it’s not the purpose of this post to ramble about the nuts & bolts of the technicalities of this achievement; it is to discuss the disruptive impact it will bring. This is not a linear jump in abilities; in fact, it is an exponential leap skyward, because for data to be of value it has to be able to be moved to where it’s needed, like any other commodity. Think about it this way: the American public has a huge knowledge resource called the Library of Congress, yet it all sits in one big building in Washington DC, and if you want to use it you have to go to it. Much like the story of “the mountain and Mohammed,” it’s not practical, physically or economically, for the public to just pick up and head there. Oh yeah, and once they got there, they could spend years trying to find what they were looking for, so the value of this data is basically nil.
However, what if we were to bring the mountain to Mohammed by pumping all of this indexed data over a high-speed link to your home or place of business? This is true disruptive power: as we saw in prior posts regarding “power laws,” when one aspect goes up (data transfer rates), the corresponding cost falls at an equal rate in the opposite direction. In short, by 2015 we will see a major tipping point in the ability to manufacture and exchange human knowledge…