InfoWorld testing and reviews policy

InfoWorld, in conjunction with IDG Enterprise Test Labs (an alliance of the independent product testing and review teams of Computerworld, InfoWorld, and Network World), follows strict ethical guidelines when testing and reviewing products to ensure that all results are objective, fair, and accurate.

Compensation

All reviews are based on our own hands-on testing, conducted by staff or freelance reviewers under our editorial guidance. Reviews are commissioned and paid for solely by InfoWorld or other members of IDG Enterprise Test Labs.

We recognize that some of our reviewers may perform private testing or other work for vendors. In order to draw a clear distinction between private testing for vendors and tests commissioned by InfoWorld or IDG Enterprise Test Labs, reviewers who test a product for a vendor are prohibited from testing that product for InfoWorld or IDG Enterprise Test Labs for a period of 90 days. Reviewers with an ongoing business relationship with a technology vendor are disqualified from reviewing any of the vendor's products.

In addition, we prohibit reviewers from accepting any form of compensation from a vendor in the course of reviewing one of the vendor's products. We also prohibit reviewers from investing directly in companies whose products they test or from having any other financial ties to those companies.

Acquiring review products

For large-scale comparative tests, we send an invitation letter to all vendors whose products we're requesting. In it, we request the materials we need, describe the test methodology, explain our policies, and set deadlines.

Typically we ask vendors to provide products for review. However, we reserve the right to review products we acquire through other channels, including purchase. If we test a product a vendor has not provided directly, we notify the vendor of that fact.

Once a vendor has sent a requested product, we will review it unless we find technical or other problems that make it impossible for us to complete the process.

We try to review currently available production code for all products, because it represents what's available to our readers. However, we will look at beta code in the early stages of the review process in order to speed our final evaluation of the gold or released code. When we do review beta code, we will clearly identify it as such.

For high-end hardware and software, we may offer vendors the opportunity to visit the testing site to help us configure their products. If we invite one vendor, we issue the invitation to all participating vendors.

The review process

During the installation and testing process, our reviewers maintain an accurate record of all changes made to default product configurations, whether made by the reviewer or by the vendor's representatives. If a reviewer uncovers performance results that deviate strongly from our expectations, we will contact the vendor and share those findings.

This gives the vendor the opportunity to verify our findings independently. We want to make sure that any unusual results we uncover can be attributed to the product and not to errors in our testing procedures. Where appropriate, we can include the vendor's feedback in the review to explain unexpected results. However, vendors' requests to see a review before publication are not granted, and no vendor may influence the test results.

We respond promptly to vendors or readers who reasonably question our results or methodology. Accuracy is the primary criterion for every review, and any time we fail to meet that criterion, we make the appropriate corrections on our Web site.

If we discover that we've made a mistake during the testing process, we feel obligated to rectify it. That may mean retesting a product, which we will do if the mistake is on our part and the test bench is still in place. (If a vendor disputes our results because, for instance, it sent a hardware unit that was limited compared to the other units tested despite having read and understood the proposed methodology, we will not retest a different unit.) We sometimes test using hardware and software loaned by vendors, to which we have access for only a limited time. For that reason, it's important that we receive any inquiries about our results within a week of the review appearing online.

While we do our best to ensure we're making the most appropriate tests, we can't guarantee that our results mark the only possible assessment of a given product. Users who emphasize uses other than the primary ones we test for may draw different conclusions.

©1994-2014 InfoWorld, Inc.