If we want to rejuvenate the software development jobs market in the United States, we need to radically rethink how we deliver IT education to American students, to say nothing of how we hire IT workers. The Academy for Software Engineering, due to open its doors in New York City in fall 2012, is a step in the right direction.
Most of the really top-notch developers I know became passionate about computing in high school or earlier. Unfortunately, for many of them it was in spite of the schools they attended, not because of them.
Meanwhile, demand for software developers is only accelerating. CNN Money ranked software development at the top of its list of "Best Jobs for Fast Growth" in 2011. The Dice 2011-2012 Salary Survey lists eight skills that commanded six-figure salaries and experienced above-average growth in 2011; all eight are software development skills. And IT salaries overall are climbing, despite lingering pessimism about the state of the U.S. economy.
That's great news if you're looking for a software development job. It's not so great if you're doing the hiring. Rising wages indicate a severe shortage of skilled tech workers, forcing many companies to outsource software development and IT management overseas. Encouraging more Americans to enter computing fields is the only way to reverse that trend, and investing in education is an important step toward that goal.
Are U.S. workers priced out of the market?
Pessimists argue that it doesn't really matter how many new jobseekers we pump into the U.S. technology market. As long as offshoring allows companies to staff IT positions at developing-nation wages, they say, there's no reason to return jobs to the U.S. But is that really true?
For most companies, hiring domestic employees is eminently preferable to hiring overseas ones. Compared to the outsourcing boom of the early 2000s, most employers have grown much less sanguine about the benefits of offshoring. The problems that arise due to language, cultural, and time-zone differences are well documented. If the raw cost savings didn't seem so dramatic, many companies probably wouldn't even consider it -- and we know now that those promised savings often evaporate once the project is underway.
If offshoring still looks good on paper, it's at least partly because the cost of hiring American software developers is often so disproportionate to the value of the work needed. Not every software development job requires a master's degree in computer science. Most entry-level positions involve little more than rote programming and code maintenance. The fact that the average salary for U.S. software developers is approaching six figures suggests that, at least for some roles, the skills shortage has inflated compensation to unrealistic levels.
Unfortunately, college degree programs have long been touted as the only viable road to a software development career. By the time prospective applicants have completed four or more years of postsecondary education, however -- particularly with college costs on the rise -- they can scarcely afford to enter the workforce at entry-level wages. Where will U.S. businesses find the journeyman coders they need to round out their workforces, if not overseas?
It beats digging ditches (by a long shot)
But let's not be pessimistic. In fact, to many of us, the idea that there is a shortage of Americans willing and motivated to do IT work for a reasonable wage seems absurd. Today's teenagers are steeped in computing and the Internet. They play online games, they're active on social networks, and they're fluent with PCs, smartphones, and tablets. Many build their own Web pages, and some go further than that. Are none of these people employable?