How We Test
PC World conducts two major forms of testing on all products under consideration for review: hands-on testing by editors and writers who are experts in the product category, and formal testing by PC World Labs.
The following is a synopsis of the tests that PC World Labs and PC World editors perform for major product categories. We will add other major categories soon, including our procedures for testing televisions.
Cameras

Point-and-shoot models and advanced models: We test all cameras on default settings with their included memory card. If no card is provided, we use a Kingston memory card. To gauge picture quality, we take a series of indoor shots, with and without flash, at the camera's highest resolution. We photograph a complex still life, a target resolution chart, and a mannequin to see how well each camera captures subtle details such as color accuracy and skin tones. Next, we shoot a series of images at the highest resolution as we increase the camera's ISO setting from 200 to its maximum, to test the camera for image noise. We then record two test videos at maximum quality and resolution, one in a fully lit (5000K) room and one in a low-light room. To evaluate the images, our panel of judges first examines the ISO noise test shots on a color-calibrated monitor, then calibrates a FujiFilm Pictrography 3500 printer and prints the images on 8-by-10-inch photo paper. Our judges review the photos and assign image-quality scores; we then average those scores to determine a final "Lab Tested" verdict.

We also test battery life for these models: half of the shots use the flash and half do not, and zooming and powering the camera on and off are also part of the test. Our battery test has a maximum count of 500 shots.
Although point-and-shoot and advanced cameras may differ in function and build, we test both types with identical settings. Differences between the two categories in areas such as image quality and manual controls appear as bonus points in the scores of the advanced models.
Single-lens reflex models: We test all cameras with their included memory card. If no card is provided, we use a Kingston memory card. To gauge picture quality, we take a series of shots, with and without flash, at the camera's highest resolution. We photograph a complex still life and a mannequin using automatic settings in Program/Full-Auto mode to see how well each camera captures subtle color and exposure under its default settings. We then photograph the same still life and a resolution moiré chart with semiautomatic settings using aperture priority, custom white balance, and exposure bracketing. We pick the best shots of each of those two subjects for judging. We also test the camera's ability to minimize noise across a range of ISO settings, starting at 200 and going up to the camera's maximum setting. To evaluate the images, our panel of judges first examines the ISO noise test shots on a color-calibrated monitor, then calibrates a FujiFilm Pictrography 3500 printer and prints the images on 8-by-10-inch photo paper. Our judges review the photos and assign image-quality scores; we then average those scores to determine our final "Lab Tested" verdict. We do not test battery life for this type of camera.
We base the image-quality rating of the camera on five categories: exposure, color, sharpness, distortion, and overall.
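The averaging step behind the "Lab Tested" verdict is simple arithmetic. As a minimal sketch (the function, rating scale, and sample scores below are hypothetical illustrations, not PC World's actual tooling), each judge rates all five categories, and the per-judge averages are averaged into one score:

```python
# Hypothetical sketch of deriving one image-quality score from
# multiple judges' ratings across the five categories.
CATEGORIES = ["exposure", "color", "sharpness", "distortion", "overall"]

def image_quality_score(judge_ratings):
    """judge_ratings: list of dicts mapping category -> score (e.g. 1-5).
    Returns the average of each judge's mean category score."""
    per_judge = [
        sum(r[c] for c in CATEGORIES) / len(CATEGORIES)
        for r in judge_ratings
    ]
    return sum(per_judge) / len(per_judge)

# Made-up ratings from two judges on a 1-5 scale:
ratings = [
    {"exposure": 4, "color": 5, "sharpness": 4, "distortion": 3, "overall": 4},
    {"exposure": 5, "color": 4, "sharpness": 4, "distortion": 4, "overall": 4},
]
print(round(image_quality_score(ratings), 2))  # averages 4.0 and 4.2 -> 4.1
```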
Cell Phones

We perform no lab-based tests on cell phones. We consider a number of factors in assigning the final verdict for a particular model of phone, including its hardware design, call quality, input options (how well the keyboard works), software, multimedia functions, and camera image quality; we also assess what comes in the box.
For multimedia-equipped phones, we use the handset to take pictures both inside and outside. We then transfer the files back to a computer and pull them up in an image editing program. We look for the quality differences that separate sharp, solid photographs from fuzzy, unpleasant images. This process can include evaluations of a picture's graininess, color accuracy, noise, and blurriness, to name just a few characteristics--the same holds true for phones that can take video as well.
Desktops and Laptops
Contemporary computer benchmarks fall into two distinct camps: synthetic and nonsynthetic. Synthetic benchmarks are specifically designed to run tests that don't necessarily reflect real-world use of a system. That's why PC World turns to its WorldBench 6 platform for all system testing. This nonsynthetic, real-world benchmark measures a system's performance by using everyday programs to generate measurable results, whether that's the time it takes to compress files, run a series of Photoshop commands, or encode movies.
WorldBench 6 uses automated test scripts across eight different applications to simulate the real-world use of a system in a measurable context. The better a PC is at various tasks associated with the programs, from loading Web pages to running Office commands to encoding video, the higher the overall score the system receives. After running the full WorldBench 6 suite multiple times in quick succession, we average the results together to create a final score, which then factors into an editor's overall rating for a desktop PC.
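The final-score step described above reduces to averaging the overall scores of the repeated runs. A minimal sketch (the function name and run scores are made up for illustration):

```python
# Illustrative sketch: average several back-to-back benchmark-run
# scores into one final score, as described for the WorldBench runs.
def final_score(run_scores):
    """Average a list of per-run overall scores; requires at least one run."""
    if not run_scores:
        raise ValueError("need at least one run")
    return sum(run_scores) / len(run_scores)

# Three hypothetical runs of the full suite:
print(final_score([98, 101, 100]))
```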
We use repeatable demos in Unreal Tournament 3 to determine the graphical prowess of a particular system. We run the game at four resolutions (1024-by-768, 1680-by-1050, 1920-by-1200, and 2560-by-1600) in both normal- and high-quality modes, and at the end of each run we record the system's average frame rate in frames per second.
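An average frame rate for a timed demo is just the frame count divided by elapsed time, recorded once per resolution and quality combination. The numbers below are made up for illustration:

```python
# Hypothetical sketch: compute average frames per second for a
# repeatable timedemo at each tested resolution.
def average_fps(frames_rendered, elapsed_seconds):
    """Average frame rate over one demo run."""
    return frames_rendered / elapsed_seconds

RESOLUTIONS = ["1024x768", "1680x1050", "1920x1200", "2560x1600"]
# Example frame counts for a 60-second demo (invented numbers):
results = {res: average_fps(frames, 60.0)
           for res, frames in zip(RESOLUTIONS, [7200, 4800, 3900, 2100])}
print(results["1024x768"])  # 7200 frames over 60 s -> 120.0 fps
```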
When reviewing laptops, we use the laptop as a primary machine over the course of several days. This includes using the laptop's bundled software, listening to audio and video files of different types and quality to discern any flaws in output, and streaming media from an assortment of wireless devices in multiple locations to detect any playback issues. In short, we try to simulate exactly how an average person would interact with the device, from carrying it around and surfing the Web in a coffee shop to parking it at a desk and editing photos.
We also run the laptop through a battery test, which logs time as the laptop cycles through MPEG-2 videos and simulated typing until the battery dies. We run these tests with any included Wi-Fi turned off and the laptop's display set to medium brightness.
Storage

We use two identical test beds to run storage testing, based on identical drive images that we reload for each new storage device. When testing internal hard drives, we set up the drive in question as the machine's single primary hard drive, and we load the operating system image onto it.
Each test system is configured with a RAM drive that serves as the basis for read/write tests when we evaluate a new product. Otherwise, performing tests from a slower hard drive to a faster, to-be-reviewed drive would create a bottleneck that would incorrectly drag down the results of the drive under evaluation. We copy large files as well as smaller collections of folders and files from the RAM drive to the storage drive, and vice versa, to establish read and write scores based on the time elapsed. We also run a malware scan using the command-line application a2cmd, part of the a-squared Anti-Malware and a-squared Free packages, as well as an installation and uninstallation of OpenOffice.org. We also test internal drives with two hard-disk-intensive application tests from WorldBench 6, Nero and WinZip. We time all processes in seconds.
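The core of a timed copy test is wall-clock timing around a file transfer. This minimal sketch uses temporary directories standing in for the RAM drive and the drive under evaluation (the function and paths are illustrative assumptions, not the lab's actual harness):

```python
# Minimal sketch: derive a write-test time from the elapsed
# wall-clock duration of copying a file set to the target drive.
import shutil
import tempfile
import time
from pathlib import Path

def timed_copy(src: Path, dst: Path) -> float:
    """Copy a directory tree and return elapsed seconds."""
    start = time.perf_counter()
    shutil.copytree(src, dst)
    return time.perf_counter() - start

with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "ramdrive"   # stand-in for the RAM-drive source
    src.mkdir()
    (src / "sample.bin").write_bytes(b"\0" * 1024 * 1024)  # 1 MB test file
    elapsed = timed_copy(src, Path(tmp) / "target")        # drive under test
    print(f"write test took {elapsed:.3f} s")
```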
For all drives, we also measure the power usage of the products in three separate phases: during the tests, in the idling state, and in the sleep state. For drives powered by the system (internals, solid-state drives, and portables), we measure the power usage of the entire system. We test solid-state drives as if they were internal hard drives, with no additional tests performed save for those described above.
Monitors

We start by placing all monitors a minimum of 18 inches away from any other display that is powered on, with lighting provided by several different banks of daylight-balanced fluorescent lamps. We attach all tested monitors to the same system for benchmarking, and we reset all LCDs to the vendor's default settings (on a digital DVI signal, if possible) prior to calibration. We calibrate each monitor using a GretagMacbeth Eye-One Display 2 colorimeter with Eye-One Match 3 calibration software.
Three editorial staff members come together to form a "jury" for the evaluation process. A test monitor is left on for at least 30 minutes prior to the actual tests, which we judge on a five-point scale ranging from "Superior" to "Poor" quality. We use four images to test contrast, grayscale performance, photographic color accuracy and detail, and skin-tone replication. In addition, we run a series of motion tests to check for flicker or ghosting, we evaluate dead pixels based on a series of screens of uniform color, we check the display's maximum viewing angles, and we determine whether an attached Mac Mini can identify the display's native resolution.
We also test a monitor's power consumption over three different states: on, idle, and off. The "on" test measures the display's watt-hours during a 3-minute slideshow of full-field black, white, red, green, and blue colors.
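A watt-hour figure for the "on" test follows from the average power draw and the test duration. A quick sketch of the arithmetic (the function and sample wattage are illustrative):

```python
# Illustrative sketch: energy in watt-hours from average power
# draw over the 3-minute slideshow described above.
def watt_hours(avg_watts: float, minutes: float) -> float:
    """Energy consumed = average power (W) x time (hours)."""
    return avg_watts * (minutes / 60.0)

# A display averaging 30 W over the 3-minute slideshow uses 1.5 Wh:
print(watt_hours(30.0, 3.0))
```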
Printers

We test printers on both the PC and Mac platforms for the number of pages per minute (ppm) they can produce for text and, when appropriate, the pages per minute they can produce for color graphics and photos. We also assess their text, line-art, color-graphics, grayscale, and color-photo quality. For speed testing, we time the printer "from click to clunk"--that is, from the time we hit the Print command to the time when the last page of the job comes to rest in the output tray. We also measure the "first page out" time, which is the time the computer and printer take to process the job and deliver its first page.
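A pages-per-minute figure follows directly from the page count and the "click to clunk" time. A hedged sketch of that arithmetic (function name and timings invented for illustration):

```python
# Illustrative sketch: pages-per-minute from a timed print job.
def pages_per_minute(pages: int, elapsed_seconds: float) -> float:
    """Throughput = pages printed / elapsed time in minutes."""
    return pages / (elapsed_seconds / 60.0)

# A ten-page text document finishing in 75 seconds works out to 8 ppm:
print(pages_per_minute(10, 75.0))  # -> 8.0
```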
We test for speed and print quality using a variety of monochrome and full-color documents. The monochrome documents include a ten-page text document, a three-page text document that contains images and different-size fonts, and a monochrome photo. The color documents include a two-page spreadsheet, a one-page print from a Website, a page with two photographs of nature and people plus color bars, and a single color photograph showing objects in motion.
Multifunction printers (MFPs) undergo additional tests for scanning speed and quality, as well as for copying speed and quality.