But traditions die hard. Some companies simply insist on a bachelor's or even a master's degree, whether because it's an easy way to thin the pile of résumés or because the diploma seems to certify some intangible quality, like a deep and versatile interest in working with computers. Whatever the reason, a significant number of people continue to believe that a sheepskin is essential, so developers scanning the want ads run into the stock-up-on-diplomas dilemma time and again.
The practical value of a college degree is controversial. Some find the typical university curriculum too focused on theoretical questions about algorithms to be a meaningful benchmark in the workplace. The professors, critics say, are more interested in whether an algorithm's running time grows polynomially or exponentially than in the realities of shipping software.
Others believe that this abstract understanding of algorithms and data structures is essential for doing a good job with new challenges. Languages come and go, but a deep understanding lasts until we retire.
Should you specialize or go broad when it comes to programming languages?
A good developer can program in any language because the languages are all just if-then-else statements wrapped together with clever features for reusability. But every developer ends up having a favorite language with a set of idioms and common constructs that are burned into the brain.
But the newer languages are often seductive. Not only do they solve the problems that have been driving us nuts about the older languages, but no one has yet had time to discover the new aggravations they bring.
Employers are often as torn as developers when it comes to committing to a new language. On one hand, they love the promise that a new programming language will sweep away old problems; on the other, they're prudent enough to be skeptical of fads. A technology commitment can span decades, and they must choose wisely to avoid being shackled to a onetime flashy language that no one knows anymore.
For developers, the best position is often to obtain expertise in a language with exploding demand. Before the iPhone came out, Objective-C was a fading language used to write native applications for the Mac. Then things changed and demand for Objective-C soared. The gamble for every developer is whether the new FooBar language is going to fade away or explode.
Should you contribute to open source projects?
The classic stereotype of open source projects is that they're put together by unkempt purists who turn up their noses at anything to do with money. That stereotype is quickly fading as people learn that experience with major open source projects can be a valuable calling card, and even a career unto itself.
The most obvious advantage to working on an open source project is that you can share your code with a potential employer. There are no nondisclosure agreements or proprietary restrictions that keep you from sending out a pointer to your corner of the project and saying, "I wrote that." Anyone can look at it. If you've achieved committer status, it shows you work well enough with others and know how to contribute to an ongoing project. Those are valuable skills that many programmers never develop.