For most of us, especially in these tough economic times, the Web's rise as the platform of choice for enterprise application development seems like a no-brainer. But to paraphrase The Return of the Living Dead, maybe we need more brains.
Sure, Google has a vested interest in pushing computing into the cloud. Unless your company is also a leading provider of Internet information services, however, you don't have as much incentive.
Web development is popular because it's fast, versatile, and relatively inexpensive -- and it's certainly easy to find developers. But that doesn't mean the alternatives don't have advantages of their own, and in some cases the Web's weaknesses might outweigh its strengths. In the interest of healthy debate, here are five reasons why concentrating your development efforts on browser-based apps might not be the best idea.
1. It's client-server all over again.
Web applications encourage a thin-client approach: the client handles UI rendering and user input, while the real processing happens on servers. What sense does that make when any modern laptop packs enough CPU and GPU power to put yesterday's Cray supercomputer to shame?
Concentrating computing power in the datacenter is fine if you're a Google or a Microsoft, but that approach puts a lot of pressure on smaller players. Scaling small server farms to meet demand can be a real challenge -- just ask Twitter.
Furthermore, security vulnerabilities abound in networked applications, and the complexity of the browser itself seemingly makes bugs inevitable. Why saddle your apps with that much baggage?
2. Web UIs are a mess.
The Web's stateless, mainly forms-based UI approach is reliable, but it's not necessarily the right model for every application. Why sacrifice the full range of real-time interactivity offered by traditional, OS-based apps? Technologies such as AJAX only simulate in the browser what desktop software could do already.
And while desktop developers are accustomed to building apps with consistent UI toolkits such as the Windows APIs, Apple's Cocoa, or Nokia's Qt, building a Web UI is too often an exercise in reinventing the wheel. Buttons, controls, and widgets vary from app to app. Sometimes the menus are along the top, other times they're off to the side. Sometimes they pop down when you roll over them, and sometimes you have to click. That inconsistency hurts your development budget, but it hurts usability more.
3. Browser technologies are too limiting.
HTML and CSS are clearly deficient when it comes to rich interactivity. Witness the proliferation of multimedia plug-ins such as Flash, QuickTime, and Silverlight. Relying on these outside dependencies increases the complexity and support cost of your applications. Why bother? These tricks wouldn't be necessary if you weren't trying to shoehorn interactivity into the browser instead of sticking to the desktop.
4. The big vendors call the shots.
Recently, Sun Microsystems CEO Jonathan Schwartz described the browser as "hostile territory" for independent developers. It's a world divided between giants, he said, with Microsoft's Internet Explorer on one side and Google's stake in Chrome and Firefox on the other.
Schwartz's statements may be self-serving, but he does have a point. Increasingly, the evolution of Web standards is being driven by major browser vendors -- new features are implemented first and standardized later. Independent developers have little genuine input into the future direction of the Web. And that's to say nothing of the ongoing bickering between the various vendors. Does it make sense to rely on client-side software that's such a moving target?
5. Should every employee have a browser?
At one point, a computer on an employee's desk was for work. Today, every Web-enabled PC is a gateway to shopping, TV and movies, games, music, online chat, and countless other diversions -- not to mention illicit activities such as porn and copyright infringement -- and every browser is a potential vector for phishing and malware attacks.
You could make a case that it's unwise to allow employees unfettered access to the Web if your company values productivity, particularly in high-turnover environments such as help desks and call centers. But if your internal applications are Web-based, every employee needs a browser anyway, so you'll have to host the apps onsite or maintain careful firewall rules to prevent abuse of your Internet access.
Is the case against Web apps cut and dried? Of course not. For many applications, Web-based development and deployment remains the cheapest, fastest route to market. But in our zeal to cut costs by sticking to Web standards and technologies, it's important to understand the trade-offs, too.