Power and Temperature

To test power and temperature we perform a manual run-through in Cyberpunk 2077 at “Ultra” settings for real-world in-game data. We record the results with GPU-Z sensor logging, reporting the “Board Power” and, when available, “GPU Chip Power” readings for our wattage data.

GIGABYTE GeForce RTX 3080 Ti EAGLE 12G Video Card board power draw

Looking at the Board Power Draw reported by GPU-Z, it is no surprise that the overclocked GIGABYTE GeForce RTX 3080 Ti EAGLE 12G consumes the most power; it is, after all, overclocked. At default, the GIGABYTE GeForce RTX 3080 Ti EAGLE 12G actually comes in with slightly lower board power than the GIGABYTE GeForce RTX 3090 GAMING OC. This makes sense: it has less VRAM, but higher clock speeds on the GPU.

When we overclock it, power demand goes up 6%. That matches the performance lift we saw in games perfectly: we averaged a 6% performance gain, a 1:1 performance-to-power increase, so the added power draw makes sense and is quite efficient. Just be aware that the card does draw this much power, so make sure your build can support it with a good power supply.
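The 1:1 scaling claim above can be sanity-checked with a quick calculation. The baseline wattage below is a placeholder, not the reviewer's raw data; only the ~6% power and performance deltas come from the article.

```python
# Quick check of the power-vs-performance scaling described above.
def pct_change(base: float, new: float) -> float:
    """Percentage change from base to new."""
    return (new - base) / base * 100.0

baseline_power_w = 350.0                        # hypothetical stock board power
overclocked_power_w = baseline_power_w * 1.06   # +6% power draw, as measured

power_delta = pct_change(baseline_power_w, overclocked_power_w)
perf_delta = 6.0                                # average in-game gain reported

print(f"Power increase: {power_delta:.1f}%")
print(f"Performance gained per % of extra power: {perf_delta / power_delta:.2f}")
```

With equal deltas the ratio comes out to 1.00, which is what the review means by a 1:1 performance-to-power increase; overclocks usually return less performance per watt than this.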

GIGABYTE GeForce RTX 3080 Ti EAGLE 12G Video Card GPU temperature

Temperatures were good on the GIGABYTE GeForce RTX 3080 Ti EAGLE 12G video card, relatively speaking. At 68.7°C it runs hotter than the GIGABYTE GeForce RTX 3090 GAMING OC, but remember that both have similar coolers and the GIGABYTE RTX 3080 Ti EAGLE 12G sustains a higher GPU clock frequency. Compared to the Founders Edition, the GIGABYTE RTX 3080 Ti EAGLE 12G runs cooler at 68.7°C versus 74.2°C, a 7% improvement in temperatures.
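The 7% figure follows directly from the two temperatures quoted above; a minimal check:

```python
# Verifying the ~7% temperature improvement quoted above, using the
# two GPU core temperatures from the review.
def improvement_pct(reference: float, measured: float) -> float:
    """How much cooler 'measured' runs relative to 'reference', in percent."""
    return (reference - measured) / reference * 100.0

founders_edition_c = 74.2   # Founders Edition GPU temperature (°C)
eagle_12g_c = 68.7          # GIGABYTE EAGLE 12G GPU temperature (°C)

print(f"{improvement_pct(founders_edition_c, eagle_12g_c):.1f}% cooler")  # ≈ 7.4%
```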

One of the largest improvements was in memory temperatures. With the card overclocked, memory temperatures were in the low 80s (°C), while the Founders Edition ran much warmer even at 100% fan speed. Whether at default or overclocked, memory temperatures are simply cooler on the GIGABYTE GeForce RTX 3080 Ti EAGLE 12G.

We also want to give huge kudos for the fan noise on this video card. It’s quiet. We didn’t even realize the fans were running as high as 78% during default operation, and even turned up to 100% they were among the quietest fans we’ve ever heard.

Brent Justice

Brent Justice has been reviewing computer components for 20+ years. Educated in the art and method of the computer hardware review, he brings experience, knowledge, and hands-on testing with a gamer-oriented...


8 Comments

  1. Outstanding review. I think you showed that for gaming, the 3080 Ti, even the lowest-end version from Gigabyte, keeps up with and exceeds an overclocked 3090. In at least games, the 3090 makes very little sense for the extra money.

    Rumors of better card availability abound as well, with increased production and the crypto bubble popping. Hopefully that’s good news for those looking to get this generation of video cards.

    Doom Eternal’s RT update is performing very well on Nvidia, and it looks like AMD does as well (though not as good). I hope you get a chance to explore this update in the future.

  2. Game features are a thing I’d like to be able to test more. We have a lot of backlogged hardware to get through, but when I can, I would like to look at Ray Tracing, DLSS, and FSR in games that have been released.
  3. In at least games, the 3090 makes very little sense for the extra money.

    Even JHH didn’t bill it as primarily a gaming card; they had a whole segment about using the extra VRAM for content creation and so on at the launch (the only one I’ve watched in my life).

    Obviously, more = better for some things, but the margins between these in terms of performance pale in comparison to just being able to buy any of them!

  4. This is the card I ended up with from Newegg’s launch day shuffle. So far it’s been a solid card and I’ve been able to play everything at 4K@100+ FPS. I average roughly 130 frames in Doom Eternal at 4K with everything on and maxed out with the quality DLSS setting.

    The only down side (not really) is that it’s ugly as sin and while it’s not a deal breaker I’m casually looking to trade for an ASUS or keeping my fingers crossed I get my notification for the water cooled eVGA model.

  5. This is the card I ended up with from Newegg’s launch day shuffle. So far it’s been a solid card and I’ve been able to play everything at 4K@100+ FPS. I average roughly 130 frames in Doom Eternal at 4K with everything on and maxed out with the quality DLSS setting.

    The only down side (not really) is that it’s ugly as sin and while it’s not a deal breaker I’m casually looking to trade for an ASUS or keeping my fingers crossed I get my notification for the water cooled eVGA model.

    I don’t think it’s ugly, but it’s too d@mn tall.

  6. I can see where height might be a problem, but if it were, wouldn’t a cooler like this not be optimal regardless most of the time?

    I was surprised that at 79% fan speed Brent could not hear it and 100% was quiet as well. So I guess priorities on noise or size have to come into play.

  7. I was surprised that at 79% fan speed Brent could not hear it and 100% was quiet as well. So I guess priorities on noise or size have to come into play.

    Brent has much more experience when it comes to what is quiet and what is loud in terms of GPUs. His opinion is arguably ‘qualified’ in this regard.

    And you’re absolutely right when it comes to priorities. I’d been wanting to ‘slim’ my desktop down for some time, but I’ve found that such an endeavor would require some very precise planning if noise were to be kept in check. Essentially, as a rough measure, when keeping performance the same, reductions in volume will generally cause increases in noise. It’s not linear but more ‘stepped’, so there’s some wiggle room at certain ranges of volume. In other words, you can usually drop volume a little bit without affecting noise or performance, but large drops in volume will generally result in quite a bit of noise increase.

    But the overall point is this: if the GPU is ‘too big’, perhaps it’s the case that’s too small ;)
