NVIDIA GeForce Driver Power Mode Settings Compared


Conclusion

NVIDIA offers three power mode settings in its driver control panel. Under Power Management Mode, the default option is “Optimal Power,” but “Adaptive” and “Prefer Maximum Performance” are also available.

A common question that comes up is whether you should change that setting to get better gaming performance. In this review we tackled that question head-on with real-world practical testing. We compared gaming performance in each power mode. In addition, we looked at Power and Wattage, GPU Temperature, and GPU Frequency to see if there are any hidden differences beyond gaming. Here are our findings.

Power and Wattage

Power and Wattage are very important: you want maximum performance, but you also want the most efficient way to get there. In our testing, we found that the power modes do directly affect total system Idle Wattage. Quite simply, “Optimal Power” and “Adaptive” provide the lowest Idle Wattage, while turning on “Prefer Maximum Performance” will make your Idle Wattage skyrocket. The power savings of Optimal and Adaptive are worth it.
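
If you want to see this on your own system, a quick way (a simple check we're sketching here, not part of our formal test setup) is to look at the GPU's performance state while the desktop is idle. With “Optimal Power” or “Adaptive” an idle card normally drops to a low-power state such as P8, while “Prefer Maximum Performance” tends to hold it at or near P0, which is where the extra Idle Wattage comes from. nvidia-smi can report this from Python:

import subprocess

# Ask the driver for the current performance state and board power draw.
# "pstate" and "power.draw" are standard nvidia-smi query fields.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=pstate,power.draw", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # an idle card on Optimal Power might report something like "P8, 14.50 W"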

At full load, however, while playing a game we noticed no differences in peak total system Wattage. None of the power modes saved power or made the GPU consume more power while gaming. This was also confirmed with the GPU-Z board power consumption number. It seems there are no big differences, no matter the power mode, when playing games at full tilt.
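
For readers who want to reproduce the full-load numbers without GPU-Z, a minimal logging sketch like the one below can sample board power, core clock, and temperature once per second while a game runs. This is our own illustration using standard nvidia-smi query fields, not the logging tool we used for this article, and the output file name is arbitrary:

import csv
import subprocess
import time

# Log board power draw (W), graphics clock (MHz), and GPU temperature (C)
# once per second until interrupted with Ctrl+C.
FIELDS = "power.draw,clocks.gr,temperature.gpu"

with open("power_mode_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "power_w", "clock_mhz", "temp_c"])
    try:
        while True:
            out = subprocess.run(
                ["nvidia-smi", "--query-gpu=" + FIELDS,
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True, check=True,
            ).stdout.strip()
            writer.writerow([round(time.time())] + [v.strip() for v in out.split(",")])
            f.flush()
            time.sleep(1)
    except KeyboardInterrupt:
        pass

Run it in the background, play for a few minutes in each power mode, and compare the logged peaks.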

There also weren’t any major differences in GPU Temperature. The RTX 2060 SUPER did fluctuate a few degrees, but nothing significant.

GPU Clock Frequency

This one is a bit more interesting. We tested the GPU clock frequency on both video cards in all three power modes. On the GeForce RTX 2080 SUPER, “Optimal Power” and “Adaptive” produced exactly the same result: a consistent 1920MHz.

However, when we switched to “Prefer Maximum Performance” the clock speed dropped halfway through the game, down to 1905MHz, which caused a small loss in performance. Our running theory is that “Prefer Maximum Performance” keeps the voltage and other factors high enough that the card actually hits its power limit (TDP) wall, causing GPU Boost to throttle the clock speed back a bit. It’s the best theory we have at the moment.
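
One way to check this theory on your own card (a sketch based on nvidia-smi's standard throttle-reason reporting; it is not data we logged for this article) is to ask the driver whether the software power cap is active while the game is running:

import subprocess

# clocks_throttle_reasons.sw_power_cap reads "Active" when GPU Boost is
# pulling clocks back because the board has reached its power limit (TDP).
out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=clocks_throttle_reasons.sw_power_cap,power.draw,power.limit",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # "Active" alongside a power draw at or near power.limit supports the theory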

With the GeForce RTX 2060 SUPER this did not happen, so it may only occur on certain tiers of GPU. It is enough of a difference, though, that you should just keep the setting on “Optimal” or “Adaptive.”

Gaming Performance

This is really where the rubber meets the road: the real-world gaming results. What we found is a bit mixed. Some games were dead even, and others showed a few FPS of difference. However, we never found a pattern. Which power mode was faster seemed random and changed from game to game. Where one game benefited from one power mode, another game would show the opposite, or the video cards themselves would flip-flop. There was no consistent, clear pattern.

At the end of the day, the differences were tiny and not noticeable in gameplay. They could even be within the margin of error, since we do manual run-throughs in a couple of games.
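
As a rough illustration of what “within the margin of error” means for manual run-throughs (the FPS numbers below are made up for the example, not our benchmark data), you can compare the gap between two power modes against the run-to-run spread of each:

from statistics import mean, stdev

# Hypothetical average FPS from three manual run-throughs per power mode.
optimal_runs = [97.2, 98.5, 96.8]
max_perf_runs = [96.9, 98.1, 97.6]

gap = abs(mean(optimal_runs) - mean(max_perf_runs))
spread = max(stdev(optimal_runs), stdev(max_perf_runs))

# If the gap between modes is smaller than the run-to-run spread,
# the difference is noise rather than a real win for either mode.
print(f"gap = {gap:.2f} FPS, run-to-run spread = {spread:.2f} FPS")
print("within margin of error" if gap <= spread else "possibly a real difference")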

What we did not see was any particular power mode giving us an overwhelming or noticeable advantage in our gameplay experience. The experience was exactly the same between all three power modes on both video cards.

Final Points

When it comes down to it, we tested many different combinations of scenarios today. We included DX11, DX12, and Vulkan games, so all the important APIs are covered. We included both demanding games and less demanding ones. We used built-in game benchmarks and real-world manual run-throughs. We tested a midrange video card and a high-end video card. We also took data points in 3DMark, measured Power and Wattage in two different tests, and recorded GPU Temperature.

While there are slight variances here and there, for the most part they don’t make a big difference. Based on our findings, we recommend keeping the driver at its default setting of “Optimal Power.” It seems to provide the best idle power savings and gaming performance.

It seems “Prefer Maximum Performance” might even hurt your performance on certain video cards in certain games, since it can cause your clock speeds to drop slightly. Unless you are having trouble getting your video card to run at its intended clock speeds, don’t enable this power mode.

With Optimal Power we don’t really see the need for the Adaptive setting; they do almost exactly the same thing, with the same power and performance. In our opinion, NVIDIA should cut the options down to just two toggles: Optimal Power and Prefer Maximum Performance are all you need, with the latter there mostly for legacy purposes.

For reviewers of computer hardware, benchmarkers, hobbyists, and gamers, we suggest leaving the Power Management Mode on the default “Optimal Power.” At TheFPSReview that is the mode we will use for all of our evaluations; in other words, we will use driver defaults.

