
How We Test Desktops (2018 to Mid 2021 Methodology)

Standardized testing is an important facet of reviews at PCMag.com. Here's how we test every desktop PC we review.

By John Burek
& Tom Brant
November 22, 2021

Editors' Note: Starting in August 2021, this testing regimen has been superseded for desktop reviews by an updated methodology detailed here. For reference purposes, we've retained this summary of previous testing procedures, which applied to desktop systems reviewed from 2018 to mid-August 2021.


The process of reviewing desktop computers at PCMag.com carries on core traditions that date back to the establishment of PC Labs in 1984: We compare each system to others in its category on the basis of price, features, design, and in-house performance tests.

To evaluate performance, we use a suite of software-based benchmark tests and real-world applications and games, carefully chosen to highlight the strengths and weaknesses in the tested PC's mix of components. That evaluation ranges from the processor and the memory subsystem to the machine's storage hardware and graphics silicon.

In some cases, we make use of standardized tests created by established benchmark developers. We've also created our own tests where needed. We regularly evaluate new benchmark solutions as they hit the market and overhaul our testing procedures as needed to ensure that we can accurately reflect the effects of the latest technologies.

Our desktop PC testing breaks down into two rough classes: productivity testing and graphics testing, with some supplemental tests for specialized systems such as gaming rigs or workstations. Here's a breakdown of each.

Productivity Testing

PCMark 10

Our first task is evaluating a computer's everyday productivity performance using UL's PCMark 10 benchmark, which simulates real-world productivity and content-creation workflows. (In 2014, Underwriters Labs acquired Futuremark, the maker of the long-running PCMark and 3DMark benchmarks.)

We use PCMark 10 to assess overall performance for office-centric tasks such as word processing, spreadsheet jockeying, web browsing, and videoconferencing. The test generates a proprietary numeric score; higher numbers are better, and the scores are meaningful primarily when compared to one another.

We run the main test suite supplied with the software, not the Express or Extended version. Note that all else being equal, a higher screen resolution will mean lower performance in PCMark 10 (the more pixels to push, the more resources required). For that reason, we test all desktop PCs at 1,920 by 1,080 pixels (1080p), save for all-in-one (AIO) desktops with built-in displays. We test those at the screen's native resolution, which may be higher or lower than 1080p.
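
To see why resolution matters, consider the raw pixel counts involved. This quick calculation (our illustration, not part of PCMark itself) shows how much more work each frame demands as resolution climbs:

```python
# Per-frame pixel counts at common desktop resolutions, relative to our
# standard 1,920-by-1,080 test resolution. Illustration only.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base_pixels = 1920 * 1080
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels per frame ({pixels / base_pixels:.2f}x 1080p)")
```

A 4K frame carries exactly four times the pixels of a 1080p frame, which is why we normalize on 1080p for apples-to-apples comparisons.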

PCMark 8 Storage

We assess the speed of the PC's main boot drive using another UL benchmark, PCMark 8. This suite includes a dedicated Storage subtest that reports a proprietary numeric score.

As with PCMark 10, higher numbers are better. The results from systems with cutting-edge solid-state drives (SSDs) tend to cluster together closely on this test.

Cinebench R15

Next in line is Maxon's CPU-crunching Cinebench R15. Derived from Maxon's Cinema 4D modeling and rendering software, this benchmark is a pure measure of CPU horsepower, fully threaded to make use of all available processor cores and threads; we run it at the All Cores setting. Think of it as an all-out processor deadlift.

Cinebench stresses the CPU rather than the GPU to render a complex image. The result is a proprietary score indicating a PC's suitability for processor-intensive workloads when running fully threaded software.
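
Cinebench's code is Maxon's own, but the "fully threaded" principle it exploits is easy to demonstrate. Here's a minimal Python sketch of ours (not Maxon's code) that runs the same CPU-bound chore serially, then fans it out across every available core:

```python
import math
import multiprocessing as mp
import time

# Illustration of "fully threaded" scaling: the same CPU-bound task,
# run one chunk at a time, then spread across all logical cores.
def crunch(n):
    return sum(math.sqrt(i) for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * mp.cpu_count()  # one chunk of work per logical core

    start = time.perf_counter()
    for job in jobs:
        crunch(job)
    serial = time.perf_counter() - start

    start = time.perf_counter()
    with mp.Pool() as pool:
        pool.map(crunch, jobs)
    parallel = time.perf_counter() - start

    print(f"Serial: {serial:.2f}s  All cores: {parallel:.2f}s  "
          f"Speedup: {serial / parallel:.1f}x on {mp.cpu_count()} logical cores")
```

The more cores and threads the chip offers, the bigger the speedup, which is exactly the behavior the All Cores setting is designed to expose.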

Handbrake 1.1.1

Cinebench is often a good predictor of our Handbrake video-transcoding trial. This is another tough, threaded workout that's highly CPU-dependent and scales well as you add cores and threads.

In this test, we put a stopwatch on test systems as they transcode a standard 12-minute clip of 4K video (the open-source Blender demo short movie Tears of Steel) to a 1080p MP4 file. We use the Fast 1080p30 preset in version 1.1.1 of the Handbrake app for this timed test. Lower results (i.e., faster times) are better.
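
Our timed runs are done by hand, but the same workload can be scripted through HandBrake's command-line interface. A minimal sketch, assuming HandBrakeCLI is installed and on the system PATH (the input filename below is a stand-in, not our actual test file):

```python
import subprocess
import time

# Minimal sketch of a timed HandBrake transcode. Assumes HandBrakeCLI
# is installed and on the PATH; the source filename is hypothetical.
SOURCE = "tears_of_steel_4k.mov"      # 12-minute 4K test clip (stand-in path)
OUTPUT = "tears_of_steel_1080p.mp4"

start = time.perf_counter()
subprocess.run(
    ["HandBrakeCLI", "--input", SOURCE, "--output", OUTPUT,
     "--preset", "Fast 1080p30"],
    check=True,
)
minutes = (time.perf_counter() - start) / 60
print(f"Transcode time: {minutes:.1f} minutes (lower is better)")
```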

Adobe Photoshop CC Photo Editing Test

Our final productivity test is a custom Adobe Photoshop image-editing benchmark. Using an early 2018 release of the Creative Cloud version of Photoshop, we apply a series of complex filters and effects (Dust, Watercolor, Stained Glass, Mosaic Tiles, Extrude, and multiple blur effects) to a PCMag-standard JPEG image. (We use a script executed via an Actions file of our own making.) We time each operation and add up the total. As with Handbrake, lower times are better.

The Photoshop test stresses the CPU, storage subsystem, and RAM, but it can also take advantage of most GPUs to speed up the process of applying filters. Systems with powerful graphics cards may see a boost from that.
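
The real test drives Photoshop itself through an Actions file, but the scoring logic is just a stopwatch around each step. Here's a generic sketch of that time-and-sum pattern, with sleep calls standing in for the actual filters:

```python
import time

# Generic "time each step, sum the total" pattern behind the Photoshop
# test. The operations below are stand-ins; the real test applies
# Photoshop filters via a scripted Actions file.
def run_timed_suite(operations):
    total = 0.0
    for name, operation in operations:
        start = time.perf_counter()
        operation()
        elapsed = time.perf_counter() - start
        total += elapsed
        print(f"{name:>14}: {elapsed:6.2f}s")
    print(f"{'Total':>14}: {total:6.2f}s (lower is better)")

run_timed_suite([
    ("Watercolor",    lambda: time.sleep(0.2)),  # stand-in workload
    ("Stained Glass", lambda: time.sleep(0.2)),
    ("Extrude",       lambda: time.sleep(0.2)),
])
```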

Graphics Performance

Judging graphics performance requires using tests that are challenging to every system yet yield meaningful comparisons across the field. We use some benchmarks that report proprietary scores and others that measure frames per second (fps), the frequency at which the graphics hardware renders frames in a sequence, which translates to how smooth the scene looks in motion.
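
The frame-rate math itself is simple: frames rendered divided by elapsed time. A small worked example, using made-up per-frame render times:

```python
# Average fps is frames rendered divided by elapsed seconds.
# The per-frame render times below are made-up example data.
frame_times_ms = [16.7, 16.9, 17.2, 33.4, 16.8, 17.0]

total_seconds = sum(frame_times_ms) / 1000
average_fps = len(frame_times_ms) / total_seconds
print(f"Average frame rate: {average_fps:.1f} fps")
```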

Synthetic Tests: 3DMark and Superposition

The first graphics test we employ is UL's 3DMark. The 3DMark suite includes a variety of different subtests that measure relative graphics muscle by rendering sequences of highly detailed, gaming-style 3D graphics. Many of these tests emphasize particles and lighting.

We run two different 3DMark subtests, Sky Diver and Fire Strike, which are suited to different types of systems. Both are DirectX 11 benchmarks, but Sky Diver is suited to laptops and midrange PCs, while Fire Strike is more demanding and made for high-end PCs to strut their stuff. The results are proprietary scores.

Also in the mix is a second synthetic graphics test, this time from Unigine. Like 3DMark, the Superposition test renders and pans through a detailed 3D scene and measures how the system copes. In this case, the rendering happens in the company's eponymous Unigine engine, offering a different 3D workload than 3DMark and a second opinion on the machine's graphical prowess.

We present two Superposition results, run at the 720p Low and 1080p High presets. The scores are reported in frames per second, higher frame rates being better. For lower-end PCs, maintaining at least 30fps is the realistic target, while more powerful computers should ideally attain at least 60fps at the test resolution.
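
Those 30fps and 60fps floors amount to a simple pass/fail rule. Here's how that rule of thumb might be encoded (our illustration, not part of Superposition):

```python
# Rule-of-thumb playability thresholds applied to fps results:
# 30fps for lower-end PCs, 60fps for more powerful ones.
def rate_result(fps, high_end=False):
    target = 60 if high_end else 30
    verdict = "meets" if fps >= target else "misses"
    return f"{fps:.0f}fps {verdict} the {target}fps target"

print(rate_result(42.5))                 # budget desktop vs. the 30fps floor
print(rate_result(71.3, high_end=True))  # gaming rig vs. the 60fps bar
```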

Real-World Gaming Tests

The synthetic tests above are helpful for measuring general 3D graphics aptitude, but it's hard to beat full retail video games for judging gaming performance. Far Cry 5 and Rise of the Tomb Raider are both modern, high-fidelity titles with built-in benchmarks that illustrate how a system handles real-world video games at various settings.

We run these games at both their moderate and maximum graphics-quality presets (Normal and Ultra for Far Cry 5, Medium and Very High for Rise of the Tomb Raider). We test at 1080p by default, and at higher resolutions such as 3,840 by 2,160 pixels (4K) when the system configuration warrants it. We also test all-in-one PCs at their native screen resolution if it differs from 1080p or 4K.

These results are also provided in frames per second. Far Cry 5 is a DirectX 11-based game, while we flip Rise of the Tomb Raider to DirectX 12 mode.

Special Cases: macOS, Chrome OS, and Workstations

We don't run all of the above tests on every desktop. We run Far Cry 5 and Rise of the Tomb Raider only on systems specifically designed for gaming, equipped with one or more discrete graphics cards. And we don't use PCMark, 3DMark, or Superposition when testing Apple desktops, since these benchmarks have no macOS versions. To evaluate some specialized subsets of desktops, such as workstations and Chrome OS machines, we supplement our standard tests as follows.

Chrome OS

Chrome OS desktops (Chromeboxes) are relatively rare nowadays, and none of the above tests is compatible with Chrome OS. We therefore run two benchmark tests from Principled Technologies (CrXPRT and WebXPRT) to help us make comparisons among Chrome machines. These are single-click tests without settings to tweak, and they report proprietary scores that are meaningful only relative to one another.

Desktop Workstations

For computer-aided design (CAD), CGI rendering, or data science workstations, we run our usual tests and supplement them with two workstation-specific benchmarks. The first is the multimedia rendering tool POV-Ray, which performs ray-tracing operations. The other, SPECviewperf, uses viewsets from the independent software vendor (ISV) apps Creo, Maya, and SolidWorks to render, rotate, and zoom in and out of solid and wireframe 3D models. POV-Ray results are reported in time to completion, while SPECviewperf reports frames per second.
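
For POV-Ray, the time-to-completion measurement is another stopwatch job. A minimal sketch, assuming a Unix build of POV-Ray that supports the standard -benchmark switch (check your build's documentation):

```python
import subprocess
import time

# Timed run of POV-Ray's built-in benchmark scene. Assumes a Unix build
# of POV-Ray with the -benchmark switch available on the PATH.
start = time.perf_counter()
subprocess.run(["povray", "-benchmark"], check=True)
minutes = (time.perf_counter() - start) / 60
print(f"POV-Ray benchmark: {minutes:.1f} minutes to completion (lower is better)")
```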

