Measuring the health of the community around a project is difficult because open source is about four freedoms: to use, study, modify, and distribute the software without reference to anyone else. By its nature, proprietary software comes with a meter: The vendor is constantly in your hair, insisting you obtain and track licenses. There's no such mechanism to measure open source software adoption. As a result, you can't find reliable market numbers on the use of pure open source software; all the statistics available are measures of some first- or second-order derivative of adoption. Even these can be hard to obtain.
Many of Sun's product groups took the easy way out and opted to count downloads as their adoption metric. As time progressed, things got out of hand. Marketing groups, seeing that their bonuses would be paid based on downloads, started devising programs to inflate download numbers. With advertising, conference giveaways, developer program incentives, and more, they artificially drove the download numbers and won their bonuses. Gradually, that tactic was eliminated, and the metric switched to counting how many times downloaded software "called home" and checked for updates. That was a more reliable measure, but the focus never moved to measuring true community health and growth.
If we'd had tools like Ohloh in those days, I'm sure things would have been different. What matters about an open source project is its community. "Community" is a big word, used to describe both multiple layers of co-development and multiple layers of deployment, so really useful metrics for an open source project tell you about the health and development of those layers of community. How many core developers are there -- people who actually understand the code well enough to make substantial changes? What are their motivations for being there, employment or volunteerism? How many companies are employing them? What sort of changes are being made: bug fixes, of course, but what about major new work? These are by far the most important metrics to track, because they predict the future of the code.
By comparison, adoption metrics in general and downloads in particular only tell you what happened in the past. For example, a project like Apache OpenOffice would be expected to have enormous downloads because of its huge global adoption -- in the hundreds of millions. Indeed, downloads remain enormous despite the stasis in the project. But all they tell us is the strength of the OpenOffice brand, built over a decade of work; they tell us nothing about the project itself. The opportunity is huge, and so is the responsibility to serve those hundreds of millions of users.
Watch out particularly for claims that large download numbers demonstrate "success." Anyone who focuses on download numbers alone has something to hide. If it's a commercial product, like Microsoft Windows 8, ask what proportion of the market is indicated, what this says about the brand, what techniques have been used to artificially push the numbers, and more. For an open source project, ask first and foremost about the developer community -- its diversity, stability, contribution rate, and growth. These are better measures of the things that truly contribute to software freedom growing and spreading in the long term.
This article, "The download deception," was originally published at InfoWorld.com.