Why You Should Not Use FurMark, Kombustor, OCCT or powerMAX for Stress Testing

FurMark runs the GPU cooler than demanding games/benchmarks, despite constant 100% load

On Nvidia GPUs, clock speeds under load are limited by TBP (Total Board Power) and temperature. TBP includes the power consumption of not only the GPU core but also the other components on the board, such as the VRAM and VRMs (Voltage Regulator Modules). On notebook GPUs, only the GPU core itself has a temperature sensor; the VRAM and VRMs do not.
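
You can watch these limits in action with a minimal monitoring sketch like the one below. It assumes the pynvml bindings for Nvidia's NVML library are installed and an Nvidia GPU is present; note that, consistent with the point above, NVML exposes only the core temperature on notebook GPUs.

```python
# Sketch: log the values that actually limit Nvidia boost clocks under load.
# Assumes the pynvml bindings for NVML are installed and an Nvidia GPU is present.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# Enforced board power limit (TBP) in watts -- the budget shared by core, VRAM and VRMs.
tbp_limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0
print(f"Enforced TBP limit: {tbp_limit_w:.0f} W")

try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # total board power draw
        core_temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)  # core sensor only
        sm_clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_SM)  # MHz
        print(f"{power_w:5.1f} W / {tbp_limit_w:.0f} W  |  {core_temp} C  |  {sm_clock} MHz")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```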

When you run FurMark, it sends an excessive current load through the VRMs, making them draw more power and run hotter than usual. The TBP limit is still in effect, so as the VRMs consume more of that budget, less power is left for the GPU core itself, which therefore runs at lower clock speeds and voltages, and hence at lower temperatures, than in non-FurMark 3D loads.
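
As a purely illustrative back-of-the-envelope calculation (the wattages below are hypothetical, not measured), the budget the core loses under a fixed TBP cap looks like this:

```python
# Illustrative only: hypothetical wattages showing how a fixed TBP budget is split.
TBP_LIMIT_W = 115  # hypothetical enforced board power limit

def core_budget(board_overhead_w):
    """Power left for the GPU core after VRAM/VRM consumption, under a fixed TBP cap."""
    return TBP_LIMIT_W - board_overhead_w

print(core_budget(30))  # typical game: ~85 W available to the core -> higher clocks/voltage
print(core_budget(50))  # FurMark-style load: only ~65 W for the core -> lower clocks/voltage
```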

The extreme current pushed through the VRMs when running FurMark, combined with the increased VRM temperatures (which you cannot see!), increases the stress on those components and can shorten their lifespan. In extreme cases it can even cause the VRMs to burn out, especially on notebooks whose VRMs have no active, or even passive, cooling.

FurMark is useless for stability testing when overclocking or undervolting

FurMark uses the OpenGL API, which very few games use, and the API in use does make a difference when stability testing. Furthermore, because FurMark makes the GPU run at lower clock speeds and voltages, it cannot test the stability of the higher clock speeds and voltages reached in actual games and benchmarks. All of this means that settings which are stable under FurMark may not pass in other 3D applications.
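
One way to check this yourself is to record the peak core clock each workload actually reaches. The sketch below again assumes pynvml is installed; run it once while FurMark is looping and once during a demanding game or benchmark, and compare the results.

```python
# Sketch: record the peak SM clock a workload actually reaches.
# If FurMark never hits the clocks a real game reaches, it never tested
# those points of your overclock/undervolt curve. Assumes pynvml is installed.
import time
import pynvml

def peak_sm_clock(duration_s=120, interval_s=0.5):
    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
    peak = 0
    end = time.time() + duration_s
    while time.time() < end:
        peak = max(peak, pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_SM))
        time.sleep(interval_s)
    pynvml.nvmlShutdown()
    return peak

if __name__ == "__main__":
    print(f"Peak SM clock over the sampling window: {peak_sm_clock()} MHz")
```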
