The ongoing debate over how best to get applications onto mobile devices -- through native deployments or mobile Web applications -- will remain a front-burner question, developers say, given the pros and cons of each approach.
The native experience is second to none
"The Web and HTML5 have come a long way, but they have not gotten to the native experience -- the UI, the multitouch, what users expect from an application -- yet," says Jesse Newcomer, mobile development manager at Homes.com.
Freelance developer Ketan Majmudar sees offline use as a weak point of mobile Web applications compared to native ones -- they either have to talk to an online Web service to pull down data or need a data store bundled with them. "HTML5 as a technology is not mature enough yet. It's nearly there, but there's a lot of hoops you have to jump through," such as with data downloading, he says. Native applications, meanwhile, can ship with data stored in their bundle when an app is downloaded. "The majority of your data is in place."
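The hoop Majmudar describes is, in essence, a cache-first pattern: a Web app must fetch its data over the network at least once and stash it locally before it can work offline, whereas a native app's bundle already contains it. A minimal sketch of that pattern, using a hypothetical `cacheFirst` helper and a plain `Map` as a stand-in for browser storage such as `localStorage`:

```typescript
// Sketch of the cache-first pattern a mobile Web app needs for offline data.
// The Map stands in for a browser store like localStorage; the loader stands
// in for a network call. All names here are illustrative, not from a library.
type Loader = () => string;

function cacheFirst(
  store: Map<string, string>,
  key: string,
  load: Loader // would hit a Web service in a real app; unavailable offline
): string {
  const cached = store.get(key);
  if (cached !== undefined) {
    return cached; // served locally -- works with no connection
  }
  const fresh = load(); // only possible while online
  store.set(key, fresh); // stash for later offline use
  return fresh;
}

// Usage: the first call must go to the "network"; the second is served
// from the local store, so it would succeed even offline.
const store = new Map<string, string>();
let networkCalls = 0;
const load = () => {
  networkCalls++;
  return "listing data";
};

cacheFirst(store, "homes", load); // fetches and caches
const second = cacheFirst(store, "homes", load); // cache hit
console.log(second, networkCalls); // "listing data" 1
```

The catch is the first run: until that initial fetch completes, the Web app has nothing to show -- which is exactly the gap a native app's bundled data store closes.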
"Native development will never go away. Objective-C developers will always be required," Majmudar says. Adds developer Paul Nelson, a systems engineer and Web developer at logistics services company Morgan Supply on Demand: "I notice speed and the ability to control memory more when you do native." He says Facebook made a "huge mistake" in creating an HTML5 application for iOS (an effort that did not succeed). "They have the money and the resources to make a native app."
Plus, native development sometimes is just necessary to access certain features, such as the Siri voice-command capability in iOS, says Jonnie Spratley, director of product design at mobile experience provider AppMatrix. "There will always be a need just because of certain features," Spratley says.
HTML5 and hybrid approaches take hold