2011 InfoWorld Technology Leadership Awards: Technology Deployment winners
For most technology leaders, the bulk of the job is putting technology in the field to meet business needs -- in other words, deployment. It's the bread and butter of IT. But as common as deployment challenges are, some stand in a class of their own due to complexity, scope, politics, and the like. The winners in this category exemplify such in-a-class-of-their-own efforts.
Matt Larson, VP of DNS research, Verisign: The security flaws in the Domain Name System that DNSSec is meant to fix have been known for years, but the Internet is such a large, sprawling, loosely managed network that implementing the solution is no simple task. That far-from-simple task has rested on the shoulders of Larson, who has implemented the DNSSec retrofit for the zones most essential to the functioning of the global Internet: .com, .edu, .net, .gov, and the root zone itself. The effort took several years, with the signing of .com completed just this March. We all thank you.
Chris Perretta, CIO, State Street Corp.: State Street has seen its business grow dramatically in Europe, so much so that it needed to add two data centers with all the high-speed connections, disaster recovery, green creds, and high-speed information systems you'd expect of any major financial services firm.
But State Street's deployment challenge was greater than that. Following the vision of the company's executive vice president, Perretta took the opportunity to consolidate all the on-site technology systems -- a mishmash inherited from multiple acquisitions -- into six regional systems, moving thousands of people and the systems they depended on in a complex rollout far beyond the scale of anything the 300-person IT team State Street assigned to the project had ever experienced. Those 300 technologists designed the new systems architecture, deployed it, and managed the transition. At the same time, the new information infrastructure had to support 14 additional locations.
Mohammad Rifaie, VP of enterprise information management, Royal Bank of Canada: The new frontier in business intelligence is not mining data whose meaning you already know in well-ordered data warehouses but gleaning actionable insight from the "random information" customers generate in various online channels. Analyzing customer data is old news to banks, but the twist here is that Rifaie's team used a combination of big data and semantic analysis tools to take analytics to a new level in service of business growth, rather than continue down the traditional BI path. That meant rethinking BI and the resources -- people and technology -- needed to support the "random information" opportunity.
The system put in place analyzes customer interactions (via email, interactive voice systems, and the like) -- their queries and the dialogs that ensued -- to identify customer irritants the automated systems were not addressing, to better understand trigger events for marketing campaigns, and to surface both problem and opportunity areas that bank managers could focus on to attract and retain more customers. In the process, Royal Bank decreased its need for traditional analysts, as more rank-and-file employees can use the system directly, while broadening its analysis of customer interactions from a minute percentage to nearly all.
Shawn Spott, manager of corporate intelligence and research, RBC Wealth Management: The brokerage firm's executives wanted more insight into its financial operations -- sales and market performance, mainly -- but Spott wasn't sure how to provide it. From his 15 years in BI, he had learned that the more data you gave people, the more overwhelmed they got and the less effective the analysis became. Plus, none of the supposedly user-friendly tools he examined delivered on that promise.
Rather than buy Brand X's whizbang analysis and reporting tool and call it a day, Spott spent a couple of years figuring out how to resolve the tension between presenting key information simply and handling the diverse information needs of executives. He spent 18 months on the presentation layer alone. Then he did buy a tool -- to rapidly develop his own analytics and reporting front end. With the information architecture, presentation layer, and user information segmentation all in place, he was able to use Tableau to create the reports executives needed, in a mass-customization approach to software delivery.
That's a very different approach from the typical plan of providing a broad framework for multiple user groups, with filters that users then have to figure out to get what they really want. Ironically, it takes Spott's team less effort to support his mass-customization model than the traditional methods did. That's because the front end leaves it to users to select their data sources; it handles the integration based on rules that Spott's team derived from all the custom research and reports they used to produce -- integration his group once did manually. Spott now finds he's serving more executives more data than ever before, but with no firehose effect on users and no report-generation treadmill for his staff.
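The rule-driven integration idea behind that front end can be sketched in a few lines. The following is a hypothetical illustration only -- the role names, data sources, and rules are invented for the example and are not RBC's actual system, which the article does not detail:

```python
# Hypothetical sketch: users pick data sources; pre-derived rules do the
# integration automatically, the work the team once performed by hand.

# Each rule keys on the set of sources selected and knows how to join them.
# (Sources and join logic here are illustrative assumptions.)
INTEGRATION_RULES = {
    ("market", "sales"): lambda market, sales: [
        # Enrich each sales row with the matching regional market figure.
        {**row, "index_return": market[row["region"]]} for row in sales
    ],
}

def build_report(selected, datasets):
    """Integrate whichever sources the user selected, per the stored rules."""
    key = tuple(sorted(selected))
    rule = INTEGRATION_RULES.get(key)
    if rule is None:
        raise ValueError(f"no integration rule for {key}")
    return rule(*(datasets[name] for name in key))

sales = [{"region": "US", "revenue": 1200}, {"region": "CA", "revenue": 800}]
market = {"US": 0.05, "CA": 0.03}
report = build_report({"sales", "market"}, {"sales": sales, "market": market})
```

The design choice mirrors the article's point: the effort goes into deriving the rules once, after which each user's selection is served with no per-report manual work.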