GIGABYTE RTX 3080 Ti EAGLE Overclocking

To overclock the GIGABYTE GeForce RTX 3080 Ti EAGLE 12G we can use GIGABYTE's AORUS ENGINE software, version 2.0.4. The newest version was released on 6/17 and supports the RTX 3080 Ti. It also includes the RGB Fusion 2.0 software to control the RGB lighting; that part is an optional install.

GIGABYTE GeForce RTX 3080 Ti EAGLE 12G Video Card: GIGABYTE AORUS ENGINE overclocking software

The GIGABYTE AORUS ENGINE software lets you control the GPU BOOST, MEMORY CLOCK, GPU VOLTAGE, FAN SPEED, POWER TARGET, and TARGET TEMP. Yes, you can unlock the Voltage. However, here is the rub with this video card: the Power Target headroom is very small. As you can see, we can only increase the Power Target by 5%, from 100 to 105. That is very little. For comparison, the Founders Edition could be raised 14%, from 100 to 114, giving it more headroom than this video card.
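For readers who want to check how much Power Target headroom their own card exposes without vendor software, the same limits can be read through NVIDIA's NVML interface. Below is a minimal sketch using the pynvml Python bindings; this is our own illustration and an assumption about tooling, not the AORUS ENGINE method used here, and the exact wattages will vary by card and vBIOS.

```python
# Minimal sketch: query the power-limit headroom NVML exposes for GPU 0.
# Assumes the pynvml bindings (pip install nvidia-ml-py) and an NVIDIA driver.
# This is an illustration only; the review used AORUS ENGINE and GPU-Z.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

    # NVML reports milliwatts; the "+5%" style Power Target is max vs. default.
    headroom_pct = (max_mw / default_mw - 1.0) * 100.0
    print(f"Default power limit:   {default_mw / 1000:.0f} W")
    print(f"Allowed range:         {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W")
    print(f"Power Target headroom: +{headroom_pct:.1f}%")
finally:
    pynvml.nvmlShutdown()
```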

Ultimately, this limits the card by TDP and means we hit that TDP wall much quicker when overclocking. We already determined that GDDR6X memory incurs a large power hit, so with even less headroom we have to be careful about that. However, the card shouldn't throttle due to temperature. It also means there is no way we can utilize a Voltage increase; that would just instantly put the card over the TDP. It's best to let GPU Boost take control of the Voltage.

The highest GPU Boost offset we could set without harming performance was +160, which brought the GPU Boost clock to 1825MHz. The highest memory offset we could set was +1000, which brought the memory up to 20GHz from the default 19GHz, a 1GHz overclock. It wasn't held back by temperature this time; it was held back by TDP. We think the memory really has the potential to hit at least 21GHz if it weren't held back by TDP. At 20GHz it provides 960GB/s of memory bandwidth vs. 912GB/s at the default 19GHz.
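Those bandwidth figures fall straight out of the effective memory data rate and the RTX 3080 Ti's 384-bit memory bus. A quick back-of-the-envelope check (a sketch; the bus width is the publicly listed specification):

```python
# Memory bandwidth = effective data rate (Gbps per pin) * bus width (bits) / 8.
# The RTX 3080 Ti uses a 384-bit GDDR6X memory bus.
BUS_WIDTH_BITS = 384

def bandwidth_gbs(effective_gbps: float) -> float:
    """Return memory bandwidth in GB/s for a given effective data rate."""
    return effective_gbps * BUS_WIDTH_BITS / 8

print(f"19 Gbps (stock):       {bandwidth_gbs(19):.0f} GB/s")  # 912 GB/s
print(f"20 Gbps (overclocked): {bandwidth_gbs(20):.0f} GB/s")  # 960 GB/s
```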

GIGABYTE GeForce RTX 3080 Ti EAGLE 12G Video Card: overclocked GPU frequency graph

Above is our graph showing the default clock speed of the GIGABYTE GeForce RTX 3080 Ti EAGLE 12G (blue) compared to the same video card overclocked (orange). You can see that at the 1825MHz boost setting this video card seems to hover around 1845-1865MHz while gaming. The average turns out to be 1864MHz. Since the default frequency average was 1764MHz, this is a 6% overclock over the default frequency. This, therefore, will be our final overclock for our testing today: 1864MHz/20GHz.
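If you want to reproduce this kind of average-frequency measurement on your own card, the graphics clock can be polled while a game runs and averaged afterwards. A rough sketch using the pynvml bindings follows; this is our illustration, the sample count and interval are arbitrary choices, and the review's graph came from its own logging tools.

```python
# Sketch: poll the graphics clock once per second during a gaming session,
# then report the average, similar to the frequency graph above.
import time
import pynvml

SAMPLES = 300            # roughly five minutes of gameplay
INTERVAL_SECONDS = 1.0
DEFAULT_AVG_MHZ = 1764   # stock average measured in this review

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    clocks_mhz = []
    for _ in range(SAMPLES):
        clocks_mhz.append(
            pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        )
        time.sleep(INTERVAL_SECONDS)

    average = sum(clocks_mhz) / len(clocks_mhz)
    gain_pct = (average / DEFAULT_AVG_MHZ - 1.0) * 100.0
    print(f"Average GPU clock: {average:.0f} MHz over {len(clocks_mhz)} samples")
    print(f"Gain over the stock average: {gain_pct:.1f}%")
finally:
    pynvml.nvmlShutdown()
```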

According to GPU-Z below, it achieved this overclock at 100% fan speed, at 60.9°C on the GPU and 67.1°C on the hot spot. Memory temperature is very well managed at 82°C with this overclock; temperature is not holding it back. GPU Voltage is at 1.0810V, which is boosted above the default Voltage, showing that GPU Boost is managing it. Power consumption is above the TDP, so we are at the limits of this card in terms of power headroom.
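A similar snapshot of what GPU-Z shows can be pulled from NVML, with one caveat: standard NVML does not expose the hot-spot or GDDR6X memory-junction temperatures, which is why GPU-Z or HWiNFO are still needed for those readings. A sketch, again assuming the pynvml bindings:

```python
# Sketch: read the core sensors NVML exposes (GPU temperature, fan speed,
# board power draw, graphics/memory clocks). Hot-spot and memory-junction
# temperatures are NOT available through standard NVML.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    gpu_temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    fan_pct = pynvml.nvmlDeviceGetFanSpeed(handle)
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0   # reported in mW
    core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    mem_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)

    print(f"GPU temp:    {gpu_temp_c} C")
    print(f"Fan speed:   {fan_pct}%")
    print(f"Board power: {power_w:.1f} W")
    print(f"Clocks:      {core_mhz} MHz core / {mem_mhz} MHz memory")
finally:
    pynvml.nvmlShutdown()
```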

GIGABYTE GeForce RTX 3080 Ti EAGLE 12G Video Card: overclocked GPU-Z full-load sensor data

Brent Justice

Brent Justice has been reviewing computer components for 20+ years. Educated in the art and method of the computer hardware review, he brings experience, knowledge, and hands-on testing with a gamer-oriented...

8 Comments

  1. Outstanding review. I think you showed that for gaming the 3080 Ti, the lowest-end version from Gigabyte, keeps up with and exceeds an overclocked 3090. In games, at least, the 3090 makes very little sense for the extra money.

    Now rumors of better card availability abound as well, with increased production and the crypto bubble popping. Hopefully that's good news for those looking to get this generation of video cards.

    Now Doom Eternal's RT update is performing very well on Nvidia, and it looks like it does on AMD as well (though not as well). I hope you get a chance to explore this update in the future.

  2. Game features are something I'd like to be able to test more. We have a lot of backlogged hardware to get through, but when I can, I would like to look at Ray Tracing, DLSS, and FSR in games that have been released.
  3. In games, at least, the 3090 makes very little sense for the extra money.

    Even JHH didn’t bill it as primarily a gaming card; they had a whole segment about using the extra VRAM for content creation and so on at the launch (the only one I’ve watched in my life).

    Obviously, more = better for some things, but the margins between these in terms of performance pale in comparison to just being able to buy any of them!

  4. This is the card I ended up with from Newegg's launch day shuffle. So far it's been a solid card and I've been able to play everything at 4K@100+ FPS. I average roughly 130 frames in Doom Eternal at 4K with everything on and maxed out with the quality DLSS setting.

    The only downside (not really) is that it's ugly as sin, and while it's not a deal breaker, I'm casually looking to trade for an ASUS or keeping my fingers crossed that I get my notification for the water-cooled EVGA model.

  5. This is the card I ended up with from Newegg's launch day shuffle. So far it's been a solid card and I've been able to play everything at 4K@100+ FPS. I average roughly 130 frames in Doom Eternal at 4K with everything on and maxed out with the quality DLSS setting.

    The only downside (not really) is that it's ugly as sin, and while it's not a deal breaker, I'm casually looking to trade for an ASUS or keeping my fingers crossed that I get my notification for the water-cooled EVGA model.

    I don't think it's ugly, but it's too d@mn tall.

  6. I can see where height might be a problem, but if it were, wouldn't a cooler like this be less than optimal regardless, most of the time?

    I was surprised that at 79% fan speed Brent could not hear it and 100% was quiet as well. So I guess priorities on noise or size have to come into play.

  7. I was surprised that at 79% fan speed Brent could not hear it and 100% was quiet as well. So I guess priorities on noise or size have to come into play.

    Brent has much more experience when it comes to what is quiet and what is loud in terms of GPUs. His opinion is arguably ‘qualified’ in this regard.

    And you’re absolutely right when it comes to priorities – I’d been wanting to ‘slim’ my desktop down for some time, but I’ve found that such an endeavor would require some very precise planning if noise were to be kept in check. Essentially, as a rough measure, when keeping performance the same, reductions in volume will generally cause increases in noise. It’s not linear but more ‘stepped’, so there’s some wiggle room at certain ranges of volume; in other words, you can usually drop volume a little bit without affecting noise or performance, but large drops in volume will generally result in quite a bit of noise increase.

    But the overall point is this: if the GPU is ‘too big’, perhaps it’s the case that’s too small ;)
