8 horrifying Hollywood computing cliches

We've all rolled our eyes at ridiculous misinterpretations of computer technology on TV or in the movies. These eight seem to pop up again and again

Let’s take a break from data science and application development to talk about a vastly more important topic: TV and movies. Ever notice how badly Hollywood does computing? Here are eight “computer as a magic box” plot devices that drive me nuts.

1. Waiting for the image to resolve

One of my favorite series, "Battlestar Galactica," was one of many shows to lean on the old “resolving the image” plot device. While "Battlestar" technically takes place many years ago, far away from Earth, its computers seemed very close in capability to the ones we have today -- and it has been many, many years since our computers took significant time to “resolve an image.”

What does resolving an image even mean? You can sharpen an image, adjust its contrast, and perform various color corrections. You might have algorithms that do these things more selectively, but all of them take seconds at most. Crime shows add another frequent device: “a buddy in the FBI [or NSA or CIA or wherever]” who enhances footage from a really old security camera in a convenience store. Except that's nonsense: If the camera didn’t capture the data, it isn’t there.
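To see why “enhance” can't conjure detail, here's a toy sketch (the kernel and sample values are made up for illustration) of what real sharpening does: it only redistributes pixel values the camera already recorded.

```python
# Toy "enhance": a classic 3x3 sharpen kernel applied to a grayscale
# image (a list of lists of 0-255 values). Note it only rearranges
# existing pixel data -- it cannot invent detail that was never captured.

SHARPEN = [[ 0, -1,  0],
           [-1,  5, -1],
           [ 0, -1,  0]]

def sharpen(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]              # border pixels left as-is
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = sum(SHARPEN[dy + 1][dx + 1] * img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = max(0, min(255, acc))  # clamp to the valid range
    return out

# A blurry "edge" ramping from dark to light.
frame = [[10, 10,  10,  10],
         [10, 60, 120, 200],
         [10, 60, 120, 200],
         [10, 10,  10,  10]]
print(sharpen(frame))
```

The edge looks crisper because contrast across it increases -- but every output pixel is a weighted sum of inputs that were already there.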

2. Copying is the same as moving

In one episode of "Star Trek: Voyager," the holographic Doctor was saved to a backup. Yet in every other episode, to heighten the drama, sending him to any other computer deleted him from the source. This is a frequent TV trope. When they sent him across the alien network to the Alpha quadrant, OMG, you were supposed to worry that his bits might be lost forever! Logically, though, the worst that could have happened was that he would lose his new memories, because the crew should have been able to restore him from last night’s snapshot. Why wouldn't you keep a copy of such a valuable program?
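The distinction the writers keep ignoring fits in a dozen lines. Here's a sketch (file names are invented for the joke) using Python's standard library:

```python
# Copying is not moving: after shutil.copy the original still exists;
# only shutil.move deletes the source. The Doctor deserved a backup.
import os
import shutil
import tempfile

workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "emh_doctor.bin")
with open(src, "wb") as f:
    f.write(b"please state the nature of the medical emergency")

# A snapshot before transmitting him across the alien network.
backup = shutil.copy(src, os.path.join(workdir, "emh_doctor.snapshot"))
print(os.path.exists(src))   # True -- copying leaves the source intact

moved = shutil.move(src, os.path.join(workdir, "alpha_quadrant.bin"))
print(os.path.exists(src))   # False -- moving, and only moving, deletes it
```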

3. Let me hack that for you

In my misspent youth, I might have broken into a few computers and even altered a grade ... once. However, I didn’t simply start typing and guessing password combinations. Sure, there are a few obvious ones, but actually cracking a computer network isn’t usually as instantaneous or easy as it is on every show from "How to Get Away with Murder" to "24." When the geek “tries real hard” to crack a new computer system and manages it in record time, blow a raspberry at the writer and yell, “Deus ex machina, literally!”

4. Open a socket

WTF was Chloe O’Brian in "24" talking about when she said “open a socket”? Hell, WTF were they talking about most of the time? That show was terrible all around, but for whatever reason that line annoyed me the most.
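For the record, “opening a socket” is about the least dramatic thing a program can do -- ordinary software does it thousands of times a day. A minimal sketch (the message is mine, obviously):

```python
# Opening a socket, dramatized: a TCP listener and a client exchange
# one message over the loopback interface.
import socket
import threading

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def serve():
    conn, _ = server.accept()        # wait for one connection
    conn.sendall(b"socket opened, Chloe")
    conn.close()

t = threading.Thread(target=serve)
t.start()

client = socket.create_connection(("127.0.0.1", port))
reply = client.recv(1024)
client.close()
t.join()
server.close()
print(reply)
```

That's it. No countdown clock required.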

5. It catches fire (literally)

I once accidentally watched an episode of "NCIS" in which a virus made a desktop computer catch fire, so a stereotypical goth geek put the computer in a morgue freezer to cool it off. Likewise, a key plot point of "Digital Fortress" was that software made the NSA's supercomputer catch fire. Yes, I know, "Digital Fortress" is a book by Dan Brown (an awful writer whose books all have the same plot as an episode of "Scooby-Doo"), and yes, bugs happen -- but no processor made in the last 20 years lacks a temperature failsafe, and even if one did, it would stop operating long before it burned up. I’m sure "Digital Fortress" will be made into a terrible movie some day, because it contains a number of other ridiculously impossible events.

6. Encryption 

Another "Star Trek: Voyager" thing: Someone says, “Computer, encrypt the controls,” which for some reason means the next person can’t say, “Computer, decrypt the controls, since you have the key anyhow.” In other cases, encryption seems awfully weak: It can be broken so easily that the cracking fits within the plot line of a single episode. Even broken encryption algorithms usually take longer than that. What are they using? Enigma?
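The key point (pun intended) is that symmetric encryption only locks out parties who lack the key -- and the ship's computer plainly has it. A toy sketch (XOR is NOT real cryptography, and the key and plaintext are my inventions) makes this obvious:

```python
# Toy symmetric cipher (repeating-key XOR -- NOT real crypto) to make
# the point: whoever holds the key decrypts trivially. "Computer,
# encrypt the controls" can't stop the computer itself.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse: the same call encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"NCC-74656"
controls = b"helm, shields, warp core"

locked = xor_cipher(controls, key)
print(locked != controls)                   # True: unreadable without the key
print(xor_cipher(locked, key) == controls)  # True: trivial with the key
```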

7. The Mainframe

From "Star Trek" to "Person of Interest," most TV shows refer to any “server” or “really important computer” as “The Mainframe.” I’m beginning to think IBM pays for this. Seriously, most powerful computing systems aren’t actually mainframes. The later sequels in the "Terminator" series weren't very good, but I liked that Skynet became a distributed system deployed on a botnet -- not on a mainframe.

8. Everything is compatible

I hate most modern action movies. The sensory overload of continuous explosions, with no real reason why, is too much for me. I probably wouldn’t see the new "Independence Day" for that reason alone. But much worse than things that go boom, the original had a virus uploaded from a Mac to an alien computer. That was enough for me. I walked out of that one.

Now you tell me: What are your “favorite” reasons to yell at the screen while your family and friends shake their heads at the crazy person and say, “It’s only a movie”? What have I missed over the years?

To comment on this article and other InfoWorld content, visit InfoWorld's LinkedIn page, Facebook page and Twitter stream.