Last week (OK, I'm running behind on writing up these conversations -- so sue me!), I had a lively conversation with Matt Johnston, vice president of marketing and community at uTest, a company whose tagline is "software testing community." Imagine having a pool of 15,000 QA professionals in 150 countries available on call, with each member having a profile of capabilities, and you'll get a good picture of what uTest is all about.
Since I do a lot of testing of applications that I write or manage, I keep a number of desktop and laptop computers around (currently about eight), running a variety of Windows and Linux builds and an assortment of browsers; I expand my testing repertoire with virtual machines. My primary language is English and my primary locale is the United States; when I need to test across a range of languages and locales, I typically have to enlist coworkers with the appropriate language skills whose operating systems are configured for the correct locales, although I can do a certain amount of that myself. Now add mobile devices: ouch. I can really see the need for a QA taskforce like uTest's.
Matt and I spent quite a while talking about QA in Agile environments. I recently did some editing work on a book that's about Agile QA, so I was tuned into the topic. According to Matt, QA has been a bigger pain point in Agile software organizations than in traditional shops; uTest can help quite a bit with this.
Think about a Scrum shop with one inside QA person assigned to a team. At the end of each one- or two-week sprint, the QA person is going to have a crushing integration testing load, even if he or she was proactive in working with the developers on test definitions throughout the sprint. Enter uTest: Instead of working the weekend so that the team can start the next sprint bright and early on Monday, the QA person can define a test matrix and a profile of desired testers for uTest on Friday, post the test build to uTest, and come back Monday to a folder full of bug reports.
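To make "define a test matrix" concrete: it's essentially a cross-product of the environments you want covered. Here's a minimal sketch in Python; the dimensions and values are my own illustration, not uTest's actual tester-profile schema.

```python
from itertools import product

# Illustrative dimensions only -- a real matrix would reflect the
# platforms, browsers, and locales your users actually run.
operating_systems = ["Windows 10", "Ubuntu 22.04", "macOS 14"]
browsers = ["Chrome", "Firefox", "Edge"]
locales = ["en-US", "de-DE", "ja-JP"]

def build_test_matrix(oses, browsers, locales):
    """Enumerate every OS/browser/locale combination to hand off to testers."""
    return [
        {"os": os_, "browser": browser, "locale": locale}
        for os_, browser, locale in product(oses, browsers, locales)
    ]

matrix = build_test_matrix(operating_systems, browsers, locales)
print(len(matrix))  # 3 x 3 x 3 = 27 combinations
```

Even this toy example shows why one in-house tester gets crushed: three values on three axes already yields 27 configurations, and that's before you add mobile devices.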
That sounds great, but it also sounds like either it's going to be expensive for the company or the testers will have to work for peanuts. According to Matt, that turns out not to be the case: the company pays for performance, per accepted bug report, and testers with strong reputations receive more work and higher rates.
This is crowd-sourced QA. Interesting, no?