Can software development get its groove back?

Coders have gotten a bum rap for too long -- and if the trend doesn't reverse, it could spell bad news for U.S. industry

This week, President Obama reiterated his plans to strengthen the American higher education system by bolstering the nation's two-year community colleges, including new programs backed by a $35 million grant from the Bill and Melinda Gates Foundation. "The goal is to ensure that every state in the country has at least one strong partnership between a growing industry and a community college," Obama said at a meeting of the President's Economic Recovery Advisory Board.

No industries could benefit from such partnerships more than IT and software development. Community colleges are in a unique position to lay the foundation for advanced coursework in information systems and computer science.


Unfortunately, recent trends paint a grim picture. According to DARPA, the Defense Department's research and development agency, enrollments in CS degree programs fell 45 percent between the 2003-2004 and 2006-2007 academic years. The trend is even more dramatic for women; according to the Computing Research Association, women accounted for just 12 percent of CS undergraduate degrees in 2006-2007.

It seems clear that where computing was once considered a promising career, it has increasingly become a niche occupation in recent years. IT and CS degrees have lost their luster, and no amount of money thrown at the problem is likely to reverse the trend. DARPA goes so far as to say the decline in engineering education "affects our [nation's] capacity to maintain a technological lead in critical skills and disciplines." The question is, how did we get here and what can be done about it?

Programmers: Crooks or misfits

The media must take some of the blame. The vast majority of software engineers are creative, enterprising professionals who clock in at completely ordinary, yet essential jobs. Nonetheless, the media maintains an obsession with the idea of computing professionals as inscrutable, maladjusted outcasts.

The most famous example of coder culture in mass media is, of course, "The Matrix." Although Keanu Reeves's character Thomas Anderson is described as a programmer, it's clear from the get-go that his job has little to offer him. By day he bides his time under fluorescent lights in an endless, impersonal cube farm. By night he's Neo, a hacker who earns extra cash by selling malware out of his squalid apartment to the seedy denizens of dingy goth clubs. In other words, as a programmer Anderson's rung in society is roughly equivalent to that of a meth dealer.

By the time Anderson hooks up with a revolutionary underground, we do meet some other programmers with purer motives. But in this universe, "programming" involves the preternatural ability to see the patterns in green-screens full of gibberish. It's not so much problem-solving as reading tea leaves. What's more, we're never told where one might acquire this unlikely skill. Are there night classes? Judging by the messianic theme in "The Matrix," you presumably have to be born with it, like some highly evolved version of Asperger's syndrome.

The media loves these kinds of themes for programmer characters. In William Gibson's pioneering sci-fi novels, "cyberspace" was an all-day escape for social misfits. In the movie "Hackers," it was an outlet for the rebellious impulses of disaffected teenagers. In "The Net," Sandra Bullock's character had no relationships that weren't online or over the phone.

Similarly, in fiction a programmer's highest calling is the ability to break into other people's computer systems, as if -- Stuxnet worm aside -- there's much of a career path in black-hat hacking. Even then, the character is almost never depicted as inventing anything or coming up with novel exploits. Evading computer security is simply a matter of having the right program for the job, like owning the fastest car in a drag race.

I remember looking forward to reading Ellen Ullman's novel "The Bug" when it appeared in 2003. Penned by a "computer industry pioneer," the book was billed as the first true-to-life literary novel about computer programming. It focuses on a hapless developer as he vainly tries to track down a glitch in his code. As the plot progresses and the bug continues to elude our developer, his mental state unravels. His girlfriend gets fed up with his obsession and leaves him for a smooth-talking, bidi-smoking refugee from Burning Man. Now totally forlorn and still unable to track down the bug, the programmer commits suicide. The end.

What a letdown! I can understand if the media doesn't get IT and software development, but is this really the best picture we can paint of ourselves?

What would Rodney Dangerfield say?

Naturally, authors must employ a certain amount of dramatic license when crafting stories about computer programmers. The realities of the office don't necessarily make for great fiction, even when there's a pool table and a fridge full of free Cokes. Still, it might be a lot easier for authors to find entertaining, affirmative stories to tell about computer professionals if the narrative for real-life developers in the United States was more positive or more likely to have a happy ending.

Today's professional programmer is in quite a bind. Routine coding jobs are increasingly being outsourced. Coders are told they should acquire management skills to assure themselves a place in the new economy -- but how does that help the recent CS graduate, when all the entry-level jobs have been shipped overseas? No matter how much we invest in education, there's no clear path from a CS degree to the kind of midlevel management positions that offer hope of a stable career.

Even that small hope might be in vain. Google famously favors candidates with advanced degrees, but some employees worry that even after investing heavily in education, their careers at cutting-edge Web firms may be finite. Just last month, the California Supreme Court ruled that former Google employee Brian Reid is entitled to a trial on his claim that the company practices age discrimination, an allegation that has echoed throughout the tech industry.

The uncomfortable truth is that the future for software developers may be closer to the bleak image portrayed in the mass media than we might like. On the one hand we have Neo in his cube farm, hoping his department isn't next to get the axe. On the other we have the kids from "Hackers," who know better and for whom programming is something you grow out of after high school (presumably before going on to get a law degree). Somewhere in the middle is Sandra Bullock in "The Net," struggling on her own as a freelancer, who's either incredibly lucky or maybe just some kind of freak.

President Obama wants to create partnerships between colleges and industry. Grants for education are a start, but software developers need more. They also need partnerships between employers and the workforce, where companies stop feeding into negative patterns that poison the labor market and undermine efforts to encourage CS education.

Real software development isn't about breaking firewalls and late-night hacking sessions. It's a critical discipline that calls for highly skilled, motivated professionals. In an economy where traditional manufacturing has all but disappeared from these shores, software development may be one of the last great American industries. It's time we treated it as such and gave software engineers the respect and encouragement they deserve.

This article, "Can software development get its groove back?," originally appeared at InfoWorld.com. Read more of Neil McAllister's Fatal Exception blog and follow the latest news in programming at InfoWorld.com.

Copyright © 2010 IDG Communications, Inc.
