On most days, programming is a rewarding experience, with no problem too challenging to solve. Perseverance, intuition, the right tool -- they all come together seamlessly to produce elegant, beautiful code.
But then a botched deployment, yet another feature request, or a poorly documented update with crippling dependencies comes crashing headlong into the dream.
Sure, we might wish our every effort had enduring impact, that the services our apps rely on would be rock-solid, that we would get the respect we deserve, if only from those who should know better. But the cold, harsh realities of programming get in the way.
That doesn't mean the effort isn't worth it. But it does mean we have some hard truths to face. Here are 10 aspects of programming developers must learn to live with.
Language designers argue about closures, typing, and amazing abstractions, but in the end, it's just clever packaging wrapped around good, old if-then-else statements.
That's pretty much all the hardware offers. Yes, there are op codes for moving data in and out of memory and op codes for arithmetic, but the rest is branch or not branch based on some comparison.
Folks who dabble in artificial intelligence drape a more mysterious cloak over these if-then-else statements, but at the end of the day, the clever statistical recommendation engine is going to choose the largest or smallest value from some matrix of numbers. It will perform its calculations, then skim through the list, asking, "Is this one greater? Is that one greater?" until it derives its decision.
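That skim can be sketched in a few lines. The score matrix and the summing rule below are invented stand-ins for whatever statistics the engine computes, but the decision at the end really is nothing more than a chain of comparisons:

```python
# A "recommendation engine" stripped to its core: pick the row of a
# score matrix with the largest total, using nothing but comparisons.
# The scores are hypothetical -- one row per product, one column per signal.
scores = [
    [0.2, 0.7, 0.1],   # product 0
    [0.9, 0.3, 0.4],   # product 1
    [0.5, 0.5, 0.5],   # product 2
]

def recommend(matrix):
    best_row, best_value = 0, sum(matrix[0])
    for i, row in enumerate(matrix[1:], start=1):
        total = sum(row)
        if total > best_value:   # the entire "intelligence" lives here
            best_row, best_value = i, total
    return best_row

print(recommend(scores))  # product 1 wins: 0.9 + 0.3 + 0.4 = 1.6
```

Swap in fancier math for `sum` and you have most of the field; the branch at the center stays the same.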
For the past 20 years, the word "Internet" has tingled with the promise of fabulous wealth, better friendships, cheaper products, faster communication, and everything but a cure for cancer. Yet at its core, most of the Internet is a bunch of data stored in tables.
Match.com? A table of potential dates with columns filled with hair color, religion, and favorite dessert. eBay? It's a table of deals with a column set to record the highest bid. Blogs? One table with one row for every cranky complaint. You name it; it's a table.
We like to believe that the Internet is a mystic wizard with divine wisdom, but it's closer to Bob Cratchit, the clerk from Charles Dickens' "A Christmas Carol," recording data in big accounting books filled with columns. It's an automated file clerk, not the invention of an electronic Gandalf or Dumbledore.
We see this in our programming languages. Ruby on Rails, one of the most popular comets to cross the Web, is a thin veneer over a database. Specify a global variable and Rails creates a column for you because it knows it's all about building a table in a database.
Oh, and the big, big innovation that's coming 20 years into the game is the realization that we don't always need to fill up every column of the table. That's NoSQL for you. It may try to pretend to be something other than a table, but it's really a more enlightened table that accepts holes.
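The difference is easy to picture with two hypothetical records, one row from a rigid relational table and one document from a schemaless store. The field names here are invented, but the contrast is the whole point: one format stores its holes, the other never writes them down.

```python
# A relational row must account for every column, even if only with NULLs;
# a document-store record simply leaves the holes out.
sql_row = {"name": "Alice", "hair": "brown", "religion": None, "dessert": None}
nosql_doc = {"name": "Alice", "hair": "brown"}  # absent fields stay absent

def dessert_of(record):
    # Reading from "a table that accepts holes" means tolerating absence.
    return record.get("dessert", "<never recorded>")

print(dessert_of(sql_row))    # None: an explicitly stored hole
print(dessert_of(nosql_doc))  # <never recorded>: a hole that was never stored
```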
You might think that the event listener you created for your program and labeled "save" has something to do with storing a copy of the program's state to disk. In reality, users will see it as a magic button that will fix all of the mistakes in their ruined document, or a chance to add to their 401(k), something to click to open up the heavens and lead to life eternal.
In other words, we might like to think we've created the perfect machine, but the users beat us every time. For every bulletproof design we create to eliminate the chance of failure, they come up with one combination of clicks to send the machine crashing without storing anything on disk. For every elegant design, they find a way to swipe or click everything into oblivion.
There are moments when users can be charming, but for the most part, they are quirky and unpredictable -- and can be very demanding. Programmers can try to guess how and where these peculiarities will arise when users are confronted with the end result of code, but they'll probably fail. Most users aren't programmers, and asking a programmer to think like the average user is like asking a cat to think like a dog.
This goes beyond simple cases of user stupidity. No matter how clever your invention or elegant your code, it still has to catch on. Predicting that users will not balk at a 140-character limit for expressing ire and desires is no easy business.
Somehow it feels good to know that your new software can speak XML, CSV, and Aramaic. Then comes the inevitable request: "Excuse me, our implementation team would like to know if this can decode Mayan hieroglyphics, because we might need that by the end of 2012. If it doesn't have that feature, we'll be OK, but it will be so much easier to get the purchase order signed if you could provide it. Thanks."
The users, of course, couldn't care less. They want one button, and even that one button can confuse them. The wonderful code you wrote to support the other N-1 buttons might get executed when the QA team comes through, but beyond that, there is no guarantee the sprints and all-nighters will have been anything more than busywork and bureaucracy.
Programmers don't even get the same boost as artists, who can always count on selling a few copies of their work to their parents and relatives. Our parents won't come through and run the extra code on the feature that just had to be implemented because someone in a brainstorm thought it would be a game changer.
One manager I know told me his secret was to always smile and tell his team he loved what they were doing, even if it was terrible. Then on the way out the door, he would say, "Oh, one more thing." That thing was often a real curveball that upended the project and sent everyone back to redesigning the application.
Scope creep is almost a direct consequence of the structure of projects. The managers do all of the hard work with spreadsheets before it begins. They concoct big dreams and build economic models to justify the investment.
All the hard work ends once they bring in the developers. Suddenly the managers have nothing to do but fret. Is that button in the right space? Should the log-in page look different? And fretting leads to ideas and ideas lead to requests for changes.
They love to use phrases like "while you're mucking around in there" or "while you've got the hood up." This is what happens to projects, and it's been happening for years. After all, even Ada Lovelace's program for Babbage's Analytical Engine, which many consider the first computer program, endured its own form of scope creep, born of nearly a year spent augmenting her notes.
There are two kinds of programmers: those who work for bosses who can't program and don't know how hard it can be to make your code compile, and those who work for former programmers who've forgotten how hard it can be to make your code compile.
Your boss will never understand you or your work. It's understandable when the liberal arts major in business development comes up with an idea you can't implement without a clairvoyant computer chip. They couldn't know any better.
This truth has one advantage: If the boss understood how to solve the problem, the boss would have stayed late one night and solved it. Hiring you and communicating with you is always more time-consuming than doing it alone.
We want our services to protect our users and their information. But we also want the sites to be simple to operate and responsive. The click depth -- the number of clicks it takes to get to our destination -- should be as shallow as possible.
The problem is that privacy means asking a few questions before letting someone dig deeper. Giving people control over the proliferation of information means adding more buttons to define what happens.
Privacy also means responsibility. If the user doesn't want the server to know what's going on, the user better take responsibility because the server is going to have trouble reading the user's mind. Responsibility is a hassle and that means that privacy is a hassle.
Privacy can drive us into impossible logical binds. There are two competing desires: One is to be left alone, and the other is to be sent a marvelous message. One offers blissful peace with no interruptions; the other can bring an invitation or a love letter, a job offer, a dinner party, or just a free offer from your favorite store.
Alas, you can't have one without the other. Fighting distractions will also drive off the party invitations. Hiding your email address means that the one person who wants to find you will be pulling out their hair looking for a way to contact you. In most cases, they'll simply move on.
The promise of Web 2.0 sounded wonderful. Just link your code to someone else's and magic happens. Your code calls theirs, theirs calls yours, and the instructions dance together like Fred and Ginger.
If only it were that easy. First, you have to fill out all these forms before they let you use their code. In most cases, your lawyers will have a fit because the forms require you to sign away everything. What do you get in return? Hand-waving about how your code will maybe get a response from their code some of the time. Just trust us.
Who could blame them, really? You could be a spammer, a weirdo, or a thief who wants to leverage Web 2.0 power to work a scam. They have to trust you, too.
And the user gets to trust both of you. Privacy? Sure. Everyone promises to use the best practices and the highest-powered encryption software while sharing your information with everyone under the sun. Don't worry.
The end result is often more work than you want to invest in a promise that kinda, sorta delivers.
When you start, you can grab the latest versions of the libraries and everything works for a week or two. Then version 1.0.2 of library A comes along, but it won't work with the latest version of library B because A's programmers have been stuck on the previous big release. Then the programmers working on C release some new feature that your boss really wants you to tap. Naturally it only works with version 1.0.2.
When houses and boats rot, they fall apart in one consistent way. When code rots, it falls apart in odd and complex ways. If you really want C, you have to give up B. If you choose B, you'll have to tell your boss that C isn't a real option.
This example used only three libraries. Real projects use a dozen or more, and the problems grow exponentially. To make matters worse, the rot doesn't always present itself immediately. Sometimes it seems like the problem is only in one unimportant corner that can be coded around. But often this tiny incompatibility festers and the termites eat their way through everything until it all collapses.
The presence of bitrot is made all the more amazing by the fact that computer code doesn't wear out. There are no moving parts, no friction, no oxidation, and no carbon chains acting as bait for microbes. Our code is an eternal statement that should be just as good in 100 years as it was on the day it was written. Yet it isn't.
The only bright spots are the emulators that allow us to run that old Commodore 64 or Atari code again and again. They're wonderful museums that keep code running forever -- as long as you fight the bitrot in the emulator.
For all the talk about the importance of openness, there's more and more evidence that only a small part of the marketplace wants it, and even those who do are often unwilling to pay extra for it. The free software advocates want free as in speech and free as in beer. Few are willing to pay much for either.
That may be why the biggest adopters of Linux and BSD come wrapped in proprietary code. Devices like TiVo may have Linux buried inside, but the interface that makes them great isn't open. The same goes for the Mac.
The companies that ship Linux boxes, however, have trouble competing against Windows boxes. Why pay about the same price for Linux when you can buy a Windows machine and install Linux alongside?
Walled gardens flourish when people will pay more for what's inside, and we're seeing more and more examples of cases when the people will pay the price of admission. Mac laptops may cost two to three times as much as a commodity PC, yet the stores are packed to the limit imposed by the fire code.