Gaming mice, or just mice, are available in all shapes and sizes. Picking the right mouse for yourself is difficult, and recommending one to someone else is even harder since most of the deciding factors are subjective. Whether you're an FPS player or a MOBA player, the type of games you play matters a great deal when choosing a mouse. Factors such as size, weight, grip style, macros and buttons affect your gameplay, and it all comes down to personal preference. You might think the best sensor should be the deciding factor, but for games other than first-person shooters, you don't really need the most accurate sensor. Depending on the size of your hands and your preferred grip style, you'll shortlist your options and then move on to the weight of the mouse. MOBAs and RPGs become easier to play with extra macro buttons, but extra buttons make the mouse heavier. FPS gamers prefer lightweight mice since there's a lot of swiping and lifting involved.
As you can imagine, all these factors make testing complicated. We wanted to create a test process that focuses on objectivity and keeps subjectivity to a minimum, at least where performance is concerned. Read on to find out how we test mice in our labs.
Testing rig and setup
To maintain a standard workflow, we test all mice on a single rig, since we discovered that the testing software can suffer anomalies under certain system settings. The software we use to test sensor accuracy is highly sensitive to certain programs running in the OS, so some have to be disabled while others have to be enabled. Hyperthreading is enabled (usually enabled by default), while HPET mode and USB Selective Suspend are disabled. Essentially, anything interfering with the state of the CPU has to be disabled so that nothing loads the CPU while sensor data is being recorded, since that can interfere with the mouse's polling rate. Only the mouse being tested and a keyboard are connected to the rig, keeping the remaining USB ports free to avoid errors.
Motherboard: Biostar TPower X79
CPU: Intel Core i7 3960X
Memory: HyperX Fury 8GB DDR3
GPU: MSI Gaming X RX470
SSD: SanDisk Extreme II 240GB
HDD: WD Red 2TB
Cabinet: Thermaltake Level 10 GT
Power supply: Silver Power SP-SS650
OS: Windows 10 64-bit Professional
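The polling-rate consistency mentioned above can be summarised from inter-report timestamps. The sketch below is our own illustration (not what MouseTester computes internally), assuming a list of intervals in milliseconds between successive reports: a healthy 1,000 Hz mouse should report every ~1 ms with very little jitter.

```python
# Hypothetical sketch: summarising polling-rate consistency from the
# intervals (in ms) between successive mouse reports. Function names
# are ours, for illustration only.
def polling_stats(dt_ms_samples):
    """Return (mean_rate_hz, max_jitter_ms) from inter-report intervals."""
    mean_dt = sum(dt_ms_samples) / len(dt_ms_samples)
    jitter = max(abs(dt - mean_dt) for dt in dt_ms_samples)
    return 1000.0 / mean_dt, jitter
```

A mouse whose intervals bounce between 1 ms and 2 ms, as in the inconsistent-polling graphs further below, would show a depressed mean rate and high jitter here.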
Although a robotic arm clamping the mouse over a spinning turntable would have been the ideal setup, we settled for restricting the window in which sensor data is collected. This ensures that the data gathered from different mice falls under the same constraints, making it easier to calculate a final score. We use four different types of surfaces: a regular cloth mouse pad, the Razer Goliathus Control Edition medium mouse pad, a white hard plastic desk and a glass slab with a black cloth underneath. Measurements are marked on the desk itself to avoid using a ruler every time, which would introduce errors. Around this area, a rough enclosure restricts the movement of the hand so that motions are executed accurately.
Razer Goliathus Control Edition medium mouse pad
To introduce objectivity, we use several software tools. The most important are MouseTester v1.5.3 and Enotus Mouse Test v0.1.4. MouseTester has built-in plotting of sensor data that generates graphs depicting several parameters such as accuracy, polling rate and velocity. We use MouseTester with every mouse set to 800 CPI to obtain the sensor's accuracy when the mouse is given a quick swipe on a cloth mouse pad. Overall, only three CPI levels are used across all the tests – 800, 1,600 and 3,200. If the default CPI levels don't match our criteria, we record data at the closest available levels. To maintain consistency, we ensure the swipe speed stays within a range of 2-2.4 m/s while keeping the total swipe time under 300 ms. Mouse settings are verified in the OS (Windows 10) as well, so Pointer Speed is always set to the default 6/11 with Enhance pointer precision disabled (since it introduces acceleration).
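The speed and time constraints above can be checked from the raw counts a MouseTester-style log exports. The following is a minimal sketch under our own assumptions: each sample is a tuple of (dx counts, dy counts, interval in ms) at 800 CPI; the function names are illustrative, not part of any tool.

```python
import math

# Assumed test conditions (from the article): 800 CPI, swipe peak speed
# must be 2-2.4 m/s and total swipe time under 300 ms.
CPI = 800
METERS_PER_INCH = 0.0254

def swipe_stats(samples):
    """Return (peak_velocity_m_s, total_time_ms) for a recorded swipe.

    Each sample is (dx_counts, dy_counts, dt_ms)."""
    peak = 0.0
    total_ms = 0.0
    for dx, dy, dt_ms in samples:
        total_ms += dt_ms
        # Convert counts to metres, then divide by the interval in seconds.
        dist_m = math.hypot(dx, dy) / CPI * METERS_PER_INCH
        if dt_ms > 0:
            peak = max(peak, dist_m / (dt_ms / 1000.0))
    return peak, total_ms

def swipe_is_valid(samples, v_min=2.0, v_max=2.4, max_ms=300.0):
    """True if the swipe meets our speed and duration constraints."""
    peak, total_ms = swipe_stats(samples)
    return v_min <= peak <= v_max and total_ms <= max_ms
```

A swipe that peaks outside the 2-2.4 m/s band or drags on past 300 ms would be discarded and redone, keeping the data comparable between mice.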
Acceleration recorded in MouseTester on the Devastator II mouse
Enotus Mouse Test is used to obtain certain parameters, malfunction speed being an important one. It records the maximum speed at which the mouse can track movement – a test limited by the reviewer's ability to swipe the mouse as fast as possible. This also brings a real-world parameter into the scoring and tells us how differently a mouse behaves on different surfaces.
At the start of testing, the CPI is measured using the Measure tool in MouseTester and, for added accuracy, verified in MS Paint as well (move the cursor to the left edge of the canvas at x-coordinate 0, then move the mouse a physical distance of one inch; the resulting x-coordinate gives you the CPI). The CPI is measured before running every objective benchmark, and also to verify whether the different CPI levels advertised by the manufacturer are accurate.
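The arithmetic behind the MS Paint check is simple: with Windows Pointer Speed at the default 6/11 and Enhance pointer precision off, cursor pixels map 1:1 to sensor counts, so counts divided by inches travelled gives the effective CPI. A small sketch (function names are our own, purely illustrative):

```python
def measured_cpi(counts, distance_inches):
    """Effective CPI: raw counts divided by the known physical distance.

    In the MS Paint method, 'counts' is simply the cursor's x-coordinate
    after moving one inch from x = 0."""
    return counts / distance_inches

def cpi_deviation_pct(measured, advertised):
    """Percentage deviation of the measured CPI from the advertised level."""
    return (measured - advertised) / advertised * 100.0
```

So a mouse advertised at 800 CPI that lands the cursor at x = 816 after a one-inch move is running about 2% hot.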
A close to perfect graph generated on the Zowie FK2 with the PMW3310H sensor
Measuring the lift-off distance is done using the age-old DVD technique, where the mouse is kept on one DVD and more are added until the sensor stops detecting. DVDs have a standard thickness of 1.2 mm, so if the sensor detects movement on one DVD and stops detecting on the second, the recorded lift-off distance is 1.2 mm. Lift-off distance customisation is switched off in the mouse's software (if available).
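The bracketing arithmetic can be written out in a couple of lines; this is just our illustration of the rule above, not a tool we use.

```python
# Each DVD has a standard thickness of 1.2 mm. If the sensor still tracks
# on n discs but stops on n + 1, we record the lift-off distance as
# n * 1.2 mm (the true value lies somewhere in that 1.2 mm bracket).
DVD_THICKNESS_MM = 1.2

def lift_off_distance_mm(discs_still_tracking):
    """Recorded lift-off distance for the highest disc count that tracked."""
    return discs_still_tracking * DVD_THICKNESS_MM
```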
Inconsistent polling rate on the Cooler Master MasterKeys Lite L mouse
Response time is evaluated, but it's a subjective test since it depends on the reviewer's reaction time. It also gives us a real-world figure for response time whenever the right button is clicked. We use two different tools – KeyResponsePK and Human Benchmark. KeyResponsePK calculates the response time between two successive clicks, whereas Human Benchmark calculates the average reaction time.
Consistent polling rate recorded on the Razer DeathAdder Elite
Acceleration is tested inside CS:GO and MouseTester at different CPI levels. Any form of acceleration in the system and mouse software is disabled before testing. We use the age-old technique of shooting at a wall, swiping fast to the right or left and returning to the original position. In MouseTester, the X and Y coordinates can be traced, which gives us an idea of whether there's acceleration. Again, both tests are subjective, relying on the reviewer's experience.
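The out-and-back swipe gives a simple signal to look for in the trace. The heuristic below is our own rough sketch (not MouseTester's logic): if the hand returns to the same physical spot at a different speed and the sensor applies no acceleration, the positive and negative counts should cancel out; a large net offset suggests speed-dependent scaling.

```python
def net_offset_counts(dx_samples):
    """Net X displacement, in counts, after an out-and-back swipe."""
    return sum(dx_samples)

def shows_acceleration(dx_samples, tolerance_counts=20):
    """True if the cursor failed to return near its origin.

    The tolerance absorbs small hand-placement error; the value here is
    an arbitrary illustration, not a calibrated threshold."""
    return abs(net_offset_counts(dx_samples)) > tolerance_counts
```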
To test angle snapping, we disable it in the mouse's software (if available) and draw different shapes in MS Paint. Drawing also gives us an idea of whether the sensor jitters at high CPI levels. The best way to determine whether prediction or angle snapping is present is to try to draw straight lines: if you're able to draw perfectly straight lines despite the natural wobble of your hand, angle snapping is present, which is bad for gaming.
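The straight-line test can also be expressed as a toy heuristic over the raw reports (our illustration only, not a tool we run): during a deliberately wobbly horizontal stroke, a snapping sensor suppresses the small Y movements, so an unnaturally high share of reports carries dy == 0.

```python
def snapping_suspected(samples, threshold=0.95):
    """True if the share of zero-dy reports during a wobbly horizontal
    stroke exceeds the threshold.

    Each sample is (dx_counts, dy_counts); the 0.95 cutoff is an
    arbitrary illustrative value."""
    zero_dy = sum(1 for _, dy in samples if dy == 0)
    return zero_dy / len(samples) >= threshold
```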
Angle snapping visible in MS Paint on the Devastator II
As already mentioned, mice are diverse – some are packed with amazing features while others are plain and comfortable. Scoring is kept objective, based on whether a particular feature is present on the mouse or not. Features aren't evaluated on how well they perform, since implementations vary between manufacturers. For example, the procedure to change RGB lighting modes on the mouse or in its software won't be the same for two different manufacturers. Software capability is scored the same way – only on available features, not ease of use. Outlandish features aren't considered; only the ones actually present count. Things such as modularity to replace certain parts, a DPI profile switch button, extra Teflon feet, etc., are scored.
To keep the scores objective, rather than scoring mice on how good they feel, we evaluate each mouse based on the materials used. We try to cover every part of the mouse in the calculation, including the switches, cable and scroll wheel. The coating is an important aspect of build quality, so we've further categorised the type of coating on various parts of the mouse. The touch and feel of the buttons, as well as the supported grips (palm or claw), aren't considered because they're subjective.
Essentially, the entire test process is a fight between the products for three awards based on the final score. The first is Best Performer, which goes to the mouse with the highest score in terms of sensor accuracy alone; no parameters related to other aspects are considered. This award is subject to change if a new mouse performs better than the previous winner.
The Best Buy award goes to the mouse with the highest score-per-rupee ratio. In this case, the score includes sensor performance along with the scores from the features and build quality evaluations. This way, if a product lacks high-end features or build quality but has a lower MRP, its value-for-money score automatically compensates. This is how we maintain fairness towards entry-level mice in the comparison, which don't have flashy features.
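The Best Buy arithmetic is just a ratio; the sketch below uses made-up figures purely for illustration.

```python
def value_score(total_score, mrp_rupees):
    """Score-per-rupee ratio used to rank value for money.

    total_score combines sensor performance, features and build quality."""
    return total_score / mrp_rupees

# Illustrative (fabricated) figures: a plain mouse scoring 60 at
# Rs. 1,500 out-values a feature-rich one scoring 90 at Rs. 6,000.
```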
The final award, Editor's Pick, is given based on the reviewer's personal preference. This also means the Best Performer or Best Buy can win Editor's Pick as well. The judgement is completely subjective, depending on the reviewer's daily usage and gaming experience.
This article was first published in the January 2017 issue of Digit magazine. To read Digit's articles first, subscribe here or download the Digit e-magazine app for Android and iOS. You can also buy Digit's previous issues here.