One thing you can say for traditional broadcast media: They scale really well. If you put an analog signal on the air or on a wire with enough repeaters and amplifiers, it will serve every client that connects. That's not the case with most of the network world, unfortunately. Sure, we have multicast, but it doesn't work at Internet scale -- and the Internet is where the problems lie.
First, let’s define multicast as used in IP networks. It is a method by which a single source stream can be accessed by multiple clients simultaneously, without increasing the load on the source itself. Thus, it functions much like an analog broadcast: You have a single source that a client can connect to at any time. The downside is that the client is a silent subscriber of the content and cannot control the stream; there’s no rewinding or restarting on a per-client basis. This is content broadcast over IP, and it's what television networks use to distribute video streams through their networks, what financial institutions use to receive stock quotes, and so forth.
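The "silent subscriber" model is visible right at the socket layer: a receiver simply asks the kernel to join a multicast group, and the sender transmits once regardless of how many listeners there are. Here's a minimal sketch using Python's standard `socket` module over the loopback interface; the group address 224.1.1.1 and port 50007 are arbitrary choices for illustration, not anything from a real deployment.

```python
import socket

GROUP = "224.1.1.1"   # arbitrary address in the 224.0.0.0/4 multicast range
PORT = 50007          # arbitrary port for this sketch

# Receiver: bind to the port, then join the group on the loopback
# interface via IP_ADD_MEMBERSHIP. From here on it just listens.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
recv_sock.bind(("", PORT))
mreq = socket.inet_aton(GROUP) + socket.inet_aton("127.0.0.1")
recv_sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
recv_sock.settimeout(2)

# Sender: one sendto() to the group address reaches every subscriber.
# The source does no extra work per additional listener.
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_IF,
                     socket.inet_aton("127.0.0.1"))
send_sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_LOOP, 1)
send_sock.sendto(b"stock quote tick", (GROUP, PORT))

data, addr = recv_sock.recvfrom(1024)
print(data.decode())
```

Note that nothing in the receiver can pause, rewind, or request the datagram again -- it gets whatever arrives while it's subscribed, which is exactly the analog-broadcast behavior described above.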
On the other hand, the world is rapidly moving to a demand model -- the younger generations are already there -- where streaming content is controlled by the client, and the client forms a one-to-one connection to the content source, versus multicast’s one-to-many approach. If I’m streaming video from a news site, that content is sent to me and only me as it streams. It may be cached somewhere along the way, but ultimately, that stream is unicast and not shared. Also, the content provider must accommodate the bandwidth required for that stream, as well as the resources necessary to deliver it.
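The difference in source-side cost between the two models can be put in rough numbers. A toy sketch (the 5 Mbps stream rate is an arbitrary assumption for illustration, not a figure from this article):

```python
# Toy model, not real networking: source-side bandwidth for a single
# stream under the two delivery models.
STREAM_MBPS = 5  # assumed per-client stream rate

def unicast_load_mbps(clients: int) -> int:
    """One copy of the stream per client: load grows linearly."""
    return STREAM_MBPS * clients

def multicast_load_mbps(clients: int) -> int:
    """One copy total, replicated by the network: load stays flat."""
    return STREAM_MBPS if clients > 0 else 0

for n in (1, 1_000, 1_000_000):
    print(f"{n:>9} clients: unicast {unicast_load_mbps(n):>9} Mbps, "
          f"multicast {multicast_load_mbps(n)} Mbps")
```

At a million concurrent viewers, the unicast source must push five terabits per second (before CDN offload), while a multicast source would still emit a single 5 Mbps stream. That gap is the whole problem when demand spikes.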
Ordinarily, this isn’t a problem. With a suitable broadband connection on a normal day, accessing content around the Internet is a relatively stable and consistent experience, depending mostly on how adept the content provider is at actually delivering it.
However, if it’s not a normal day, things go south quickly -- and how far south they might possibly go, I don’t think we know. We have never truly seen the impact on our modern broadcast infrastructure of an unexpected event of worldwide significance. I would guess that the Internet itself would be fine, but the content providers would get crushed, which could potentially lead to a cascade of events that effectively amount to an Internet media blackout.
I clearly recall the events of Sept. 11, 2001. I had a huge network forklift overhaul scheduled for that day, and as we started, the world turned sideways. People gathered in front of the only available television and stayed there for hours. Few people, if any, were turning to news websites for 9/11 updates, and certainly none were loading information on their mobile phones. Universally, the terrible events of the day were carried by broadcast television and radio.
The world is a vastly different place today, geopolitically and technologically. An event of similar magnitude would gather a suitably massive audience, but they would not immediately turn on their televisions. Rather, they would open an app or browser on their phone, tablet, or computer. Those events would not be broadcast. Instead, they'd be streamed on demand -- and those streams would eventually fail.
The Internet and Internet services work on economies of scale. Large websites function with the knowledge that they will have peaks and valleys of usage, and if they have 10 million users, no more than perhaps 1 million will actually be engaged with the site at any one time. Never would all 10 million attempt to connect at once. And with the advent of adaptive cloud services, large sites can spin resources up and down to handle the peaks and valleys.
But a singular event that captures the attention of almost every connected human on Earth is a spike we've never seen. It would produce a resource load on large news sites, aggregators, and discussion forums like never before. And now that we have these fantastic cloud services, many of those sites and services will be hosted with the same providers, in the same data centers, all vying for finite resources at the same time. If the event and the attention it draws are big enough, it could take down entire providers, which would in turn pull down unrelated sites.
The end result would look like a complete Internet blackout, even if the actual damage were the loss of a few huge cloud CDNs and providers like AWS. At that point, assuming the cable providers can still manage their systems, we might have cable television, and certainly broadcast television would be available. However, the communications resources underpinning the production of content distributed over those media would likely be sluggish, if not also down hard. We would effectively return to the days of 2001, with information spreading mostly via broadcast radio and television.
We have built a unicast-centric communications infrastructure to deliver astounding functions and services tailored to the infinite needs of Internet users. That infrastructure works extremely well when the world is normal, but when the world tilts, the model may collapse under the weight, especially when only a few large companies provide most of these services.
We need to hope that we have enough time to build out and spread around the underlying resources to a level where this isn’t a threat. We need more competition and broader dispersion of Internet media resources around the globe. Until then, keep a set of rabbit ears and an FM radio handy, just in case.