Apple Pay: The 5 lessons it teaches IT and business alike

Tired of being ineffective and unloved? It's time to act different

Lesson 2: Think for the long term

If you've been to an Apple event, it may seem as if the new whatever came out of nowhere. In fact, it usually has been in development for years. The iPad project, for example, predated the iPhone project at Apple, though the iPad shipped three years later. The necessary pieces came together first for the iPhone, so it shipped first.

Apple spends a lot of effort looking at new and emerging technologies, usually while they're still in the research and incubation phases. Its own engineers try to figure out both how to exploit and how to improve these technologies. It also has the patience to adopt a technology "late" if not all the pieces are ready. That's key: It's usually not one technology but a constellation of them that makes the fundamental improvement pay off big.

Remember: Apple was "late" to the MP3 player game, except the existing players were terrible, and when the revelatory iPod shipped, it quickly made the rest irrelevant. Same with the iPhone: In a few short years, it destroyed the old model epitomized by the BlackBerry and Palm Treo. (Android leveraged the iPhone's approach, as Windows did Mac OS's, but in both cases Apple still makes more actual money than its higher-market-share competition.)

Too many companies look at technologies only when they've become trendy, or they look at them in isolation. The fact that competing phones have had NFC for a couple of years is a great proof point: Who cares if they were first? The capability isn't actually used.

IT and business alike should be continuously exploring new technologies and business approaches, regularly investigating what might be useful in the future -- and why. That way, you're more ready to do it right when the time arrives. And you're more aware of ways to make your business better at any time.

Lesson 3: Tackle intractable dysfunctions when the nadir is near

Why Apple did this for payments while the banking, credit processing, and retail industries -- and vendors like IBM, Microsoft, Google, and so on -- have not is a major mystery to me. Except it isn't -- it's the usual case of an industry or company getting comfortable in its incompetence and past methods. Remember all the ERP failures in the early 2000s? They were driven by companies that wouldn't rethink what they were doing and instead used new technology to perpetuate existing bad practices and protect existing fiefdoms. The same often happens in VDI deployments today.

The retailers and bankers all want the other guy to foot the bill for new systems, and they've successfully punted the issue for decades, allowing insecure systems (magnetic stripes -- really?) to remain in use and making us customers cover the cost through hidden fees in every transaction. Frankly, had Congress not started sniffing around the issue after the Target breach, it's possible Apple Pay would not have gained many banks' and credit card processors' quick support.

And retailers still seem unwilling to invest in modern payments technology, as a Reuters report shows. That's because they bear the cost of the technology upgrade, but they've already shifted the cost of fraud to customers via the banks and credit card processors. They have no strong incentive to fix their part of the broken system.

A few dozen retailers, led by Walmart, have been promising for two years their own mobile wallet system called MCX, which will launch as CurrentC next year. It supports bar codes and QR codes so it works with much existing equipment, similar to Starbucks' mobile wallet. The app stores no credit cards on the phone; instead, it uses a connection to the retailer to generate a token that the app presents and the retailer then validates through its payment system. That's similar to Apple Pay, but without the tap-and-pay convenience or the fingerprint validation.

By disassociating the credit card data from the transaction and using disposable transaction-authorization data, schemes like Apple Pay, MCX, and Google Wallet save the retailers from having to implement onerous PCI requirements to ensure that credit card data are kept secure. That should be an incentive for them to adopt such systems, except they still need PCI infrastructure in place for old-fashioned credit card transactions, which -- thanks to those years of avoidance -- will be around for years even in the coming chip-and-PIN era.
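To make the tokenization idea concrete, here's a minimal Python sketch of the concept -- not Apple Pay's EMV tokenization or MCX's actual protocol: the wallet obtains a disposable token from a token service, the retailer accepts only that token, and the real card number never touches the retailer's systems. The TokenService and Retailer classes and their methods are hypothetical illustrations.

    import secrets
    from typing import Optional


    class TokenService:
        """Stands in for a bank- or network-side token vault (hypothetical)."""

        def __init__(self) -> None:
            # Maps disposable tokens to real card numbers; this lives outside
            # the retailer's systems, so the retailer never stores the card.
            self._vault: dict[str, str] = {}

        def issue_token(self, card_number: str) -> str:
            """Issue a random, single-use token in place of the card number."""
            token = secrets.token_hex(16)
            self._vault[token] = card_number
            return token

        def redeem_token(self, token: str) -> Optional[str]:
            """Resolve a token exactly once; a stolen token can't be replayed."""
            return self._vault.pop(token, None)


    class Retailer:
        """The merchant side: it only ever handles tokens, never card numbers."""

        def __init__(self, token_service: TokenService) -> None:
            self._token_service = token_service

        def charge(self, token: str, amount_cents: int) -> bool:
            card_number = self._token_service.redeem_token(token)
            if card_number is None:
                return False  # forged, expired, or already-used token
            # A real system would authorize the amount with the issuing bank here.
            print(f"Approved {amount_cents} cents on card ending {card_number[-4:]}")
            return True


    if __name__ == "__main__":
        vault = TokenService()
        wallet_token = vault.issue_token("4111111111111111")  # standard test number

        store = Retailer(vault)
        print(store.charge(wallet_token, 499))  # True: the token is valid once
        print(store.charge(wallet_token, 499))  # False: replaying the token fails

Because the token is single-use and maps back to the card only inside the vault, a breach of the retailer's systems yields nothing reusable -- which is exactly the property that shrinks the PCI burden.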

Thus, you have an industry that has built the cost of fraud into every transaction, allowing everyone in the industry to avoid doing the right thing. Instead of tackling the fundamental issue collectively, each segment has tried to duck the issue where it could. That's led to a higher price for us all: Customers pay more than they should, and the compliance and recovery costs only climb each year for retailers and banks.

We're paying more as a result of not seriously tackling the problem than it would cost to solve it. But we pay in drips and drabs, so we can ignore the sad truth. We've reached a point -- thanks to the massive breaches of late -- where the public and government are now really worried, threatening the business of an industry that couldn't get its own act together and instead pretended all was fine.
