Are tribal monocultures the future of software development?

An increasing tendency toward vendor-specific languages suggests a gradual eroding of developer choice


Perhaps the most visible example is Apple. Developers who want to build applications for Apple's Mac OS X and iOS platforms can choose among C, C++, and Apple's pet language, Objective-C -- and only the last can claim first-class status. Early on, Apple insisted it would support Java on Mac OS X as well, but the Java-Cocoa integration effort was abandoned in 2005. As for iOS, Apple's stance toward alternative development environments on its mobile platform -- including Java and Adobe's Flash -- couldn't be plainer: They're a no-go.

By effectively restricting developers to a single programming language, Apple achieves two things. First, it guarantees that applications for its platforms will be written to a single, consistent set of APIs, minimizing bugs and other flaws that could lead to crashes.

Second, and perhaps more important, it creates a platform-centric developer monoculture that Apple can leverage to its own advantage. Because there is no Objective-C support for the .Net runtime, for example, software originally written for Mac OS X can't easily be ported to Windows. And apps written for the iPhone tend to remain exclusive to that platform until their developers rewrite them in an entirely different language, such as Java -- often at considerable expense.

Rise of the monocultures
Apple isn't alone in adopting this strategy. Microsoft developed C# with the .Net platform, and although an open source implementation is available in the form of Mono, C# development remains strongly associated with Windows and .Net. Sun Microsystems took a fairly laissez-faire approach to Java, but Oracle's recent lawsuit against Google suggests a far more proprietary stance. And even Google itself is developing a new language, called Go.

Major IT vendors have always had a hand in developing programming languages. C, for example, was developed at Bell Labs in tandem with the Unix operating system. But while earlier languages were adopted by broad, diverse developer communities, the new breed seems more limited in scope. The new languages' aims -- and even their syntaxes -- seem similar, yet they differ enough to create pocket communities devoted to single vendors. Increasingly, tomorrow's developers won't be C programmers; they'll be Apple programmers (using Objective-C), or Google programmers (using Go), or Oracle programmers (using Java).

Microsoft doesn't need to go this route. Windows remains the most popular IT platform on the planet; developers scarcely need more enticement to write software for it.

For now, Schementi says he will remain involved in IronRuby development as the first non-Microsoft core contributor. Better still, however, would be a clear commitment from Microsoft itself -- to IronRuby and to dynamic languages on the .Net runtime in general. As Steve Ballmer famously observed, Microsoft owes much of its success to independent developers. It would be a shame if Microsoft chose to limit developers' choices now out of a misguided desire to follow in its competitors' footsteps.

This article, "Are tribal monocultures the future of software development?," originally appeared at InfoWorld.com. Read more of Neil McAllister's Fatal Exception blog and follow the latest news in programming at InfoWorld.com.

Copyright © 2010 IDG Communications, Inc.
