On Monday, Google expanded the bug bounty program it has used to secure its Chrome browser, inviting researchers to probe applications hosted on Google.com, YouTube.com, Blogger.com, and Orkut.com. The invitation is an important acknowledgement that hackers and third-party security researchers are a valuable resource.
"We've seen a sustained increase in the number of high quality reports from researchers and their combined efforts are contributing to a more secure Chromium browser for millions of users," the company says in its blog post.
In the world of desktop systems, anyone can try to find vulnerabilities in a program. While software developers generally include the obligatory "no reverse engineering" clause in their end-user license agreements, such clauses have not stopped the curious from finding bugs. The result is that companies have had to focus on their products' security. It's no stretch to say that the efforts of hackers are directly responsible for Microsoft's massive push to secure its products and have forced other companies -- such as Adobe, Oracle, and Apple -- to follow suit.
Yet the logic that applies to the world of programs running on users' hardware does not apply to Web services running on company-owned servers. When hackers and security researchers have poked into online applications, the results have usually been bad for the researcher.
In 2005, networking and security expert Eric McCarty found a flaw in the online system for submitting applications to the University of Southern California. Using the vulnerability, McCarty was able to access any of the more than 275,000 student applications on the system, including Social Security numbers. But rather than thank McCarty, USC filed a complaint with the FBI, and federal prosecutors charged the researcher. Unable to afford a defense, McCarty pleaded guilty and received a sentence of six months of home detention.
Pascal Meunier, a professor at Purdue University, nearly had a similar experience in 2006. A student in the professor's computer-security course reported that a physics website had a vulnerability. Meunier reported the vulnerability, and the site fixed it. However, when the site was hacked two months later, police scrutiny turned to the professor and his student. The lesson, the professor later wrote, is never to report Web vulnerabilities; instead, avoid the website and delete any evidence that points to a vulnerability.
"The risk of being accused of felonies and having to defend yourself in court -- as if you had the money to hire a lawyer, you’re a student -- is just too high," he wrote.
Google's change of heart may signal a broader trend among Web firms toward relaxing their policies on security researchers -- though Google is not giving researchers unfettered access to do anything they want.
"Please, only ever target your own account or a test account," the company's security team writes. "Never attempt to access anyone else's data. Do not engage in any activity that bombards Google services with large numbers of requests or large volumes of data."
Moreover, the various caveats -- including "your testing must not violate any law, or disrupt or compromise any data that is not your own" -- suggest that researchers who follow Google's guidelines should still retain a healthy sense of paranoia.