Working in deep IT is a double-edged sword. We know a lot more about the tech that nontechies use daily, but this wisdom can come back to haunt us -- especially when we don't take the time to educate fellow consumers on basic tech facts. After all, widespread confusion around terms and jargon provides great cover for unsavory business practices, most notably in the case of cable and Internet providers. It might help if we could do more educating than eye-rolling in those situations.
The chasm between technophiles and nontechies continues to grow unabated, and at the bottom of that chasm sit basic disparities in understanding, particularly around terms and units of measurement. For us in the know, technical terms have very explicit definitions; for others, not so much. Most of the world is familiar with the term “gigabyte” but has no real reference for the word. It’s up there with light-year and league as a known unit of measurement, but without real context. Heck, for many, it’s synonymous with “gigabit” and “modem.” This leads to problems in basic communication.
I read a story last week that breathlessly described a “breakthrough” that “enables downloads 50,000 times faster than ‘superfast’ broadband.” I get that this is clickbait, but it serves to illustrate my point. The article went on to describe a multiplexed connection using 15 fiber strands and compression to achieve 1.125Tbps throughput. It had absolutely nothing to do with broadband, yet the article took pains to measure this lab achievement against consumer broadband speeds and mentioned how you could download the entire "Game of Thrones" series in “a fraction of a second.”
Leaving aside all of the technical issues with that last statement, such as writing 1.125Tbps to storage, I understand why the article is presented this way. If the headline simply said researchers were able to simulate data transmissions at 1.125 terabits per second, most people would nod, then click to some other website. If you give them a familiar handle to grasp, such as a comparison to their own Internet connection, then eyebrows may go up, and some knowledge may be transferred.
Some units of measurement should be known and understood by everyone. The metric system, the imperial and US customary units used in the United Kingdom and the United States, the standard units of time, the calendar, units of currency -- we are all expected to understand these completely.
When someone says they drove for an hour and went 50 kilometers, we don’t expect that they possibly misunderstood those units and instead might have actually been driving for two days and 3,000 kilometers. If you order a pint at the pub, you don’t expect a barrel. Nobody routinely confuses $1 bills with $100 bills. However, a large number of people do not readily distinguish megabits and megabytes, and they routinely confuse storage with memory.
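The megabit/megabyte mix-up comes down to a single factor of 8 that most people never see spelled out: ISPs advertise line rates in bits per second, while files are sized in bytes. A minimal illustration (the helper function is hypothetical, purely for demonstration):

```python
# Illustrative only: ISPs advertise speeds in bits per second, but
# file sizes are reported in bytes -- the two differ by a factor of 8.
def mbps_to_megabytes_per_sec(mbps: float) -> float:
    """Convert an advertised megabit-per-second rate to megabytes per second."""
    return mbps / 8  # 8 bits per byte

# A "100 Mbps" plan moves at most 12.5 MB of file data per second.
print(mbps_to_megabytes_per_sec(100))  # 12.5
```

So the "100" on the bill and the "12.5 MB/s" in the download dialog describe the same connection, which is exactly where the confusion starts.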
To be fair, at levels outside of our normal operating range, we sometimes fail to grasp the significance of figures in even common units of measurement. Explaining how much money $1 billion really is to someone who has never had more than $1,000 is a challenge. Describing Planck time is difficult even to those who understand what a meter is and what a second is. When you get to the edges, the picture can get squirrelly.
But failing to understand everyday units of bandwidth and storage, such as gigabytes and megabits, is not the same. These measurements aren’t extremes -- they’re commonplace. To fail to understand these concepts has repercussions beyond getting a phone without enough storage or suffering a computer that runs extremely slowly because the RAM is woefully undersized.
This lack of understanding causes bigger problems. I’d wager that the true story behind the claims of Time Warner and Comcast that people don’t want gigabit Internet speeds has something to do with this. Fundamentally, most people can’t truly understand what gigabit Internet is. You might as well ask them if they want a car that gets 50 rods to the hogshead with a maximum light speed of 8.9 × 10^8.
So while the bombastic article touting the new technology that could download "Mamma Mia" in one second causes us to roll our eyes, it’s useful in the broader sense. While not technically relevant, it provides at least some form of comparison for those who couldn’t otherwise understand.
We, as techies, should be doing our part by patiently correcting megabit/megabyte errors in conversation and providing basic comparisons. An average Blu-ray movie is roughly 5GB, so that would take roughly 43 seconds on a gigabit Internet connection. On a megabit connection, that’s just shy of 12 hours. At 25Mbps -- what is considered “broadband” in the United States -- that’s about 30 minutes.
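Those figures are easy to verify. Here's a sketch of the arithmetic, assuming the 5GB file size uses binary gigabytes (2^30 bytes, as storage is typically reported) and that line rates use decimal bits per second (as ISPs advertise them):

```python
# Back-of-the-envelope download times for a ~5GB Blu-ray movie.
FILE_BYTES = 5 * 2**30     # 5 GB, treating 1 GB as 2**30 bytes
FILE_BITS = FILE_BYTES * 8  # line rates are quoted in bits per second

def seconds_to_download(speed_bits_per_sec: float) -> float:
    """Ideal transfer time, ignoring protocol overhead and congestion."""
    return FILE_BITS / speed_bits_per_sec

print(seconds_to_download(1e9))          # gigabit: ~42.9 seconds
print(seconds_to_download(1e6) / 3600)   # megabit: ~11.9 hours
print(seconds_to_download(25e6) / 60)    # 25 Mbps: ~28.6 minutes
```

These are ideal-case numbers; real connections lose some throughput to protocol overhead, but the orders of magnitude hold.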
Those are real-world numbers measured against well-known units of measurement that may help with the translation the next time you’re in this conversation. In fact, I wrote this column after a cocktail party discussion with a very successful and well-educated person who sheepishly admitted that they had no idea what a "megathing" was.
After all, if more people truly understood how their ISPs are rooking them on Internet bandwidth, we’d all benefit sooner rather than later.