9 lies programmers tell themselves

Confidence in our power over machines also makes us guilty of hoping to bend reality to our code

Programmers have pride, with good reason. No one else has the power to reach into a database and change reality. The more the world relies on computers to define how it works, the more powerful programmers become.

Alas, pride goeth before the fall. The power we share is very real, but it’s far from absolute and it’s often hollow. In fact, it may always be hollow because there is no perfect piece of code. Sometimes we cross our fingers and set limits because computers make mistakes; they are fallible, as we all know from too much firsthand experience.

Of course, many problems stem from assumptions we programmers make that simply aren’t correct. They’re usually sort of true some of the time, but that’s not the same as being true all of the time. As Mark Twain supposedly said, “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”

Here are a number of false beliefs that we programmers often pretend are quite true.

Lie No. 1: Programming languages are different

We pound the tables in the bars after work. We write long manifestos. We promise the boss that this time, this new language will change everything and wonderful software will flow from the keyboards in such copious amounts that every project will be done a month before the deadline. In the end, though, we stick data in variables and write some if-then logic to test them.

Programmers see structure in their code and dream of squeezing every last inefficiency from it. So they imagine elaborate castles in the air called “frameworks,” “scaffolding,” “platforms,” or “architectures” and fiddle with them until they offer just the right support to the current problem so that everything can be written in a few elegant lines. Alas, the next assignment has a different structure.

In the end, all of this is artifice and syntactic frosting: structural liquor that numbs the pain of coding life until it wears off. Computers are built out of transistors, and no amount of clever punctuation and type theory can hide the fact that all of our clever code boils down to one bit of doped-up silicon choosing to go left or right down the fork in the code, and there is no middle path.

Lie No. 2: Frameworks are getting better

Perhaps you built your last web application in React because you were unhappy with the pages constructed in Vue? Or maybe you wrapped together a headless Ruby with some static pages built from a templating engine because the WordPress interface was clumsy and dated? Or maybe you rewrote everything in something smaller, newer, or cooler like Marko or Glimmer or Ghost? The programmer is always searching for the perfect framework but that framework, like the end of the rainbow, never appears.

Ralph Waldo Emerson anticipated the programmer’s life when he wrote “Self-Reliance” in 1841. “Society never advances,” he noted, speaking of course of programming frameworks. “It recedes as fast on one side as it gains on the other. Its progress is only apparent like the workers of a treadmill... For every thing that is given something is taken.”

And so we see, again and again, developers creating new frameworks to patch the problems of the old frameworks, introducing new problems along the way. If a framework adds server-side rendering, it bogs down the server. But if everything is left to the clients, they start slowing down. Each new feature is a tradeoff between time, code, and bandwidth.

Lie No. 3: Null is acceptable

Figuring out how to handle null pointers is a big problem for modern language design. Sometimes I think that half of the Java code I write is checking to see whether a pointer is null.

The clever way some languages use a question mark to check for nullity helps, but it doesn’t get rid of the issue. A number of modern languages have tried to eliminate the null testing problem by eliminating null altogether. If every variable must be initialized, there can never be a null. No more null testing. Problem solved. Time for lunch.

The joy of this discovery fades within several lines of new code because data structures often have holes without information. People leave lines on a form blank. Sometimes the data isn’t available yet. Then you need some predicate to decide whether an element is empty.

If the element is a string, you can test whether the length is zero. If you work long and hard enough with the type definitions, you can usually come up with something logically sound for the particular problem, at least until someone amends the specs. After doing this a few times, you start wishing for one, simple word that means an empty variable. 
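The emptiness predicate the text describes can be sketched in a few lines. This is a hypothetical Python illustration, with `None` playing the role of null; the point is that even a "no null" design still needs some agreed-upon test for "this field holds nothing useful":

```python
from typing import Optional


def is_blank(value: Optional[str]) -> bool:
    """One simple predicate for 'this field is empty'.

    Covers both flavors of emptiness the article describes:
    the value was never supplied (None), or it was supplied
    but carries no content (an empty or whitespace-only string).
    """
    return value is None or value.strip() == ""
```

The catch, as the article notes, is that this definition is only sound for strings on this particular form; the moment someone amends the specs (is `0` empty? is an empty list?), the predicate has to be renegotiated.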

Lie No. 4: Computers can capture human choices

The problem of codifying gender and choices of possible pronouns is a big minefield for programmers. Computers deal in fixed lists and well-defined menus, and humans keep changing the rules. One very progressive school licensed an off-the-shelf application only to discover that the forms gave only two choices for gender.

Computer scientists never really solve problems, they just add another layer of indirection, in this case a pointer to an empty string field where the person can fill in their choice. Then some joker comes along and chooses “his majesty” for a pronoun, which makes some kids laugh and others feel offended. But going back to a fixed list means excluding some choices.

This design failure mode appears again and again. If you force everyone to have a first name and a family name, some will have only one name. Or then there’s someone who doesn’t want to be known by a string of Unicode characters. And what if someone chooses a new emoji for their name string and the emoji doesn’t make the final list of acceptable ones? No matter how much you try to teach the computer how to be flexible and accepting of human whims and follies, the humans come up with new logic bombs that trash the code.

Lie No. 5: Unicode stands for universal communication

There’s an earnest committee that meets frequently trying to decide which emojis should be included in the definitive list of glyphs that define human communication. They also toss aside certain emoji, effectively denying someone’s feelings.

The explosion in memes shows how futile this process can be. If the world finds emojis too limiting, spurring them to turn to mixing text with photos of cultural icons, how can any list of emojis be adequate?

Then there’s the problem of emoji fonts. What looks cute and cuddly in one font can look dastardly and suspect in another. You can choose the cute emoji, and your phone will dutifully send the Unicode bytes to your friend with a different brand phone and a different font that will render the bytes with the dastardly version of the emoji. Oops.

Lie No. 6: Human language is consistent

One of the ways that developers punt is to put in a text field and let humans fill it with whatever they want. The open-ended comment sections are made for humans and rarely interpreted by algorithms, so they’re not part of the problem.

The real problem resides in structured fields with text. When my GPS wants me to choose a road named after a saint, it tells me to “turn onto Street Johns Road.” Road names with apostrophes also throw it for a loop. It’s common to see “St. John’s Road” spelled as “Saint Johns,” “St. Johns,” or “Saint John’s.” The U.S. Post Office has a canonical list of addresses without extra characters, and it maintains an elaborate algorithm for converting any random address into the canonical form.
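The shape of such a canonicalizer is simple even if the real one is not. Here is a hypothetical toy sketch, not the actual USPS algorithm: lowercase, strip apostrophes, and expand abbreviations from a lookup table. The table itself exposes the GPS's bug, since "St." can expand to either "Saint" or "Street":

```python
# Toy abbreviation table -- illustrative only. The ambiguity of "st"
# (Saint? Street?) is exactly what tripped up the GPS in the text.
ABBREVIATIONS = {"st": "saint", "st.": "saint", "rd": "road", "rd.": "road"}


def normalize(address: str) -> str:
    """Map spelling variants of an address onto one canonical form."""
    # Drop both straight and curly apostrophes, then lowercase.
    cleaned = address.replace("'", "").replace("\u2019", "").lower()
    return " ".join(ABBREVIATIONS.get(word, word) for word in cleaned.split())
```

With this table, “Saint Johns Road,” “St. John’s Road,” and “St Johns Road” all collapse to “saint johns road”; the hard part, which keeps the Post Office’s algorithm elaborate, is deciding context-dependent cases the table cannot express.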

Lie No. 7: Time is consistent

It may feel like time keeps flowing at a constant rate—and it does, but that’s not the problem for computers. It’s the humans that mess up the rules and make a programmer’s life nasty. You may think there are 24 hours in every day, but you better not write your code assuming that will always be true. If someone takes off on the East Coast of the United States and lands on the West Coast, that day lasts 27 hours.

Time zones are only the beginning. Daylight saving time adds and subtracts hours, but does so on weekends that change from year to year. In 2000 in the United States, the shift occurred in April. Since 2007, the country has changed clocks on the second Sunday in March. In the meantime, Europe moves to “summer time” on the last Sunday in March.

If you think that’s the end of it, you might be a programmer tired of writing code. Arizona doesn’t go on daylight saving time at all. The Navajo Nation, however, is a big part of Arizona, and it does change its clocks because it’s independent and able to decide these things for itself. So it does.

That’s not the end. The Hopi Nation lies inside the Navajo Nation, and perhaps to assert its independence from the Navajo, it does not change its clocks.

But wait, there’s more. The Navajo have a block of land inside the Hopi Nation, making it much harder to use geographic coordinates to accurately track the time in Arizona alone. Please don’t ask about Indiana.

Lie No. 8: Files are consistent

It seems that merely remembering the data should be something a computer can do. We should be able to recover the bits even if the bits are filled with many logical, stylistic, orthographic, numerical, or other inconsistencies. Alas, we can’t even do that.

Whenever I ask my Mac to check the file system and fix mistakes, it invariably tells me about a long list of “permissions errors” that it dutifully repairs for me. How did the software get permission to change the permissions for access to my files if I didn’t give permission to do it? Don’t ask me.

That is just one example of how file systems don’t honor the compact between user (the person supplying the electricity) and the machine (desperate needer of electricity). Any programmer will tell you there are hundreds of other examples of situations where files don’t contain what we expect them to contain. Database companies are paid big bucks to make sure the data can be written in a consistent way. Even then, something goes wrong and the consultants get paid even more money to fix the tables that have gone south.

Lie No. 9: We’re in control

We like to believe that our instructions tell the computer what to do, and that arrogant pride is generally justified, except when it isn’t.

What? Certainly that may be true for the average nonprogramming saps, unwashed in the liniment of coding power, but not us, wizards of logic and arithmetic, right? Wrong. We’re all powerless beggars who are stuck taking whatever the machines give us. The operating system is in charge, and it may or may not let our code compute what it wants.

OK, what if we compile the Linux kernel from scratch and install only the code that we’ve vetted? Certainly we’re in control then.

Nope. The BIOS has first dibs over the computer, and it can surreptitiously make subtle and not-so-subtle changes in your code. If you’re running in the cloud, the hypervisor has even more power.

OK, what if we replace the BIOS with our own custom boot loader? You’re getting closer, but there’s still plenty of firmware buried inside your machine. Your disk drive, network card, and video card can all think for themselves, and they listen to their firmware first. 

Not only that, but your CPU might have a “Hidden God Mode” that lets someone else take command. Don’t bother looking at the documentation for an explanation because it’s not there. And those are just the problems with the official chips that are supposed to be in your box. Someone might have added an extra chip with a hidden agenda.

Even that little thumb drive has a built-in processor with its own code making its own decisions. All of these embedded processors have been caught harboring malware. The sad fact is that none of the transistors in that box under your desk report to you.

Copyright © 2019 IDG Communications, Inc.
