Apple's cautious foray into the wild and woolly world of bug bounties has proved there is more than one way to run a program. Organizations unsure about setting up a bug bounty program should take a look at Apple's model.
At the Black Hat conference in Las Vegas last week, Ivan Krstic, Apple's head of security engineering and architecture, announced the company will pay rewards of up to $200,000 for five classes of bugs in iOS and iCloud. Apple will pay $100,000 to researchers who can extract confidential data from the iOS Secure Enclave Processor; $50,000 to researchers who report code execution flaws that provide kernel privileges or unauthorized access to iCloud account information; and $25,000 to researchers who find vulnerabilities that allow a sandboxed process to "break out" and gain access to user data outside the sandbox. The $200,000 maximum reward is reserved for vulnerabilities and proof-of-concept code in the company's secure boot firmware.
"The Apple bounty program will reward researchers who share critical vulnerabilities with Apple and we will make it a top priority to resolve those and provide public recognition," Krstic said at the conference.
There is a key difference between what Krstic announced and how other programs -- such as those run by Google, Microsoft, and Facebook -- work. Apple's invitation-only program limits participation to specific researchers and would be considered a private bug bounty program.
The public programs tend to be free-for-alls where anyone can submit a bug, leaving the companies to analyze each report to determine whether to pay the bounty. This can get overwhelming, especially at the beginning, since someone -- preferably a team -- must be dedicated to sifting through those submissions to screen out low-quality reports and out-of-scope vulnerabilities. For example, if Facebook is interested in cross-site scripting flaws, a report of an authentication-related vulnerability falls outside the program guidelines and requires a different response.
That is a time commitment many organizations may not be able to make, and it can pose an operational challenge for organizations new to the vulnerability disclosure lifecycle. Excessively high submission volumes slow down the response process and cause communications delays, which can easily sour the researcher-company relationship.
"Private bounty programs are a prudent stepping stone to launching a public program, allowing companies to 'proof of concept' test their bounty processes," said Kymberlee Price, senior director of operations at Bugcrowd, which runs crowdsourced bug bounty programs for other companies.
If Apple had started off with a public bounty program, it would likely be inundated in short order with a high volume of reports of varying quality. Starting off with an invite-only bounty program makes sense as it lets Apple limit the "noise" of lower-quality submissions that typically accompanies a fresh bug bounty program, as well as giving "hacker allies a head start in collecting these bounties," said Katie Moussouris, founder and CEO of penetration testing consultancy Luta Security.
Apple is initially inviting only the security researchers it knows have the right skills and will submit quality reports, basing its selections on those who have found serious issues and reported them to Apple in the past. It's not a closed program: If someone discloses a vulnerability to Apple and the report is of sufficiently high quality, Apple can invite that researcher to join the program. LinkedIn, Riot Games, and even Tor take this approach to manage their vulnerability disclosure programs and invitation-only bug bounty programs.
"Think of it like a CTF with a prequalification round before your team gets to play in the big competition, or qualifying for a marathon before you get to run," Moussouris said. "Anyone can go for it, but they must prove their skills to be invited into the league that collects bug bounties."
A private program lets organizations experiment with the kinds of reward incentives they want to offer, as well as figure out what reporting process they want to have in place. Bounty programs extend the "many eyeballs" concept well-known in open source software development, but focus researcher energy on high-risk areas. Rewards are good incentives, channeling natural human curiosity into the areas companies are most concerned about.
Apple could have launched its program on a platform such as HackerOne or Bugcrowd to help screen out non-vulnerabilities and out-of-scope reports, and it could have tapped into the researcher communities associated with those platforms. But being Apple, the company is hardly hurting for access to qualified researchers.
At first glance, the private bug bounty program sounds similar to the consulting engagements many companies have commissioned in the past (Microsoft famously worked with security guru Dan Kaminsky to look for vulnerabilities in Windows Vista and Windows 7). Apple's program is not a consulting arrangement, however, because researchers are rewarded per bug, and the amounts vary by vulnerability severity. A consultant would typically be paid a set rate to find bugs, regardless of their number or severity.
"Despite starting private, Apple has for the first time publicly acknowledged the value of a vulnerability reward program which commits them to its growth and maturation over time," Price said. "This is a big step for them."