Image: XFX

XFX has partnered with EKWB to create the Speedster Zero Radeon RX 6900 XT. XFX says the two-slot card can be overclocked to 3 GHz, which is plausible, as some Radeon RX 6900 XT cards have already exceeded that mark. “The base of the block is CNC-machined out of nickel-plated high-grade copper, while its top is CNC-machined out of glass-like cast Acrylic.” The block comes with pre-installed brass standoffs and high-quality EPDM O-rings, and ARGB lighting is also present.

It features a number of upgrades over the reference design. Power delivery has been increased from an 11+1 to a 14+2 VRM phase design, and there is a dual BIOS. The PCB features 3x 8-pin power connectors, and while TGP isn’t listed, it could reach well over 400 watts when overclocked. The base clock is up to 2,200 MHz and the boost clock up to 2,525 MHz. An 850-watt PSU is required, while a 1,000-watt PSU is recommended. Outputs include 3x DP 1.4 with DSC, 1x HDMI 2.1 with VRR and FRL, and 1x USB-C. Pricing has not been announced.
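
As a rough sanity check on that figure, the power budget implied by the connector layout can be worked out from the usual PCI Express allowances of 150 watts per 8-pin connector and 75 watts from the x16 slot. The short sketch below is an illustrative calculation only, not an XFX specification:

```python
# Illustrative only: nominal power budget implied by the connector layout,
# assuming the usual PCIe allowances of 150 W per 8-pin connector and 75 W from the x16 slot.
EIGHT_PIN_W = 150
SLOT_W = 75

def connector_budget_watts(num_8pin: int) -> int:
    """Nominal maximum board power (in watts) the connector layout can supply."""
    return num_8pin * EIGHT_PIN_W + SLOT_W

print(connector_budget_watts(3))  # 525 W, plenty of headroom for a 400 W+ overclocked TGP
```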

The card comes with a 14+2 phase VRM design and dual BIOS, increasing overclocking potential and adding an extra safety layer if a BIOS gets corrupted. Out of the box, the GPU boost clock is rated at 2525 MHz, but XFX promises it can surpass 3000 MHz when overclocked.

Source: XFX (via KitGuru)


Peter Brosdahl



53 Comments

  1. [QUOTE=”Riccochet, post: 41566, member: 4″]
    Sooooo….where can I buy one?
    [/QUOTE]
    I clicked on their “where to buy” button and the usual suspects came up. I tried BB but nothing showed. I imagine it’s just a matter of time until the retailers have their listings updated and live.

  2. [QUOTE=”Peter_Brosdahl, post: 41572, member: 87″]
    I clicked on their “where to buy” button and the usual suspects came up. I tried BB but nothing showed. I imagine it’s just a matter of time until the retailers have their listings updated and live.
    [/QUOTE]
    and for those glorious 15-20 seconds you can buy one of the handful available.

  3. [QUOTE=”Grimlakin, post: 41592, member: 215″]
    and for those glorious 15-20 seconds you can buy one of the handful available.
    [/QUOTE]
    sad but true!

  4. If the 3000mhz is true, it could actually be quite good. Maybe even beating the 3090.

    I like putting my own water blocks on my GPU’s, but you still wind up with a card that is electrically optimized for air cooling, so you are never going to get the max benefit of the better cooling.

    With something like this, designed from the ground up with the power delivery (and probably the GPU chip binning) to make the max benefit of water cooling, you can achieve some pretty cool things.

    I’d consider this card if I could find it in stock. Would be neat to go AMD again for the first time in a while.

  5. This or the Aorus 6900XT Extreme Waterforce are the two cards I’m interested in.

    I don’t even think that Aorus card exists outside of a picture on their website. I’ve never seen one in the wild.

  6. [QUOTE=”Zarathustra, post: 41700, member: 203″]
    If the 3000mhz is true, it could actually be quite good. Maybe even beating the 3090.

    I like putting my own water blocks on my GPU’s, but you still wind up with a card that is electrically optimized for air cooling, so you are never going to get the max benefit of the better cooling.

    With something like this, designed from the ground up with the power delivery (and probably the GPU chip binning) to make the max benefit of water cooling, you can achieve some pretty cool things.

    I’d consider this card if I could find it in stock. Would be neat to go AMD again for the first time in a while.
    [/QUOTE]
    I think that’s true to a certain point, but isn’t the 3090 Kingpin basically that kind of design and the gains are still pretty incremental?

  7. Meh.

    Newegg had this one on pre-order. I hesitated and didn’t do the pre-order. Instead I refreshed it last night, thinking I might order one just as it released for regular sales. Well that was a mistake.

    It INSTANTLY went from “Launch Date 10/01/2021” to “out of stock”.

    Oh well.

  8. [QUOTE=”Zarathustra, post: 41973, member: 203″]
    Meh.

    Newegg had this one on pre-order. I hesitated and didn’t do the pre-order. Instead I refreshed it last night, thinking I might order one just as it released for regular sales. Well that was a mistake.

    It INSTANTLY went from “Launch Date 10/01/2021” to “out of stock”.

    Oh well.
    [/QUOTE]

    Huh, right after I posted this I loaded the Newegg store page again, and what do you know, [URL=’https://www.newegg.com/xfx-radeon-rx-6900-xt-rx-69xtawbd9/p/N82E16814150863′]there it was in stock,[/URL] so I bought one.

    I’m guessing either some who pre-ordered had their cards declined, or tried to order more than the 2 limit per customer. Who knows. Let’s just say that I’m not selling my old GPU until I have the thing in my hands. Who knows if it will ship, and if it does, who knows if it will get here.

    I still can’t help but think it is a bad deal for $1,799, but I am so tired of using my 5 year old GPU at this point…

    At least this is the most badass version of the 6900XT I’ve seen to date…

    I guess now it’s only a matter of time until I find out if it is a 3090 killer. Let me know if you have any benchmark requests…

  9. [QUOTE=”Zarathustra, post: 41974, member: 203″]
    Huh, right after I posted this I loaded the Newegg store page again, and what do you know, [URL=’https://www.newegg.com/xfx-radeon-rx-6900-xt-rx-69xtawbd9/p/N82E16814150863′]there it was in stock,[/URL] so I bought one.

    I still can’t help but think it is a bad deal for $1,799, but I am so tired of using my 5 year old GPU at this point…

    At least this is the most badass version of the 6900XT I’ve seen to date…

    I guess now it’s only a matter of time until I find out if it is a 3090 killer. Let me know if you have any benchmark requests…
    [/QUOTE]

    Honestly I’ve been very happy with my 6800xt so I expect you will enjoy this one quite well.

    You are in the same boat I am on CPU’s though. Once you have the new card let me know how bad the itch is to get a 5000 series Ryzen CPU. 😉

  10. [QUOTE=”Grimlakin, post: 41977, member: 215″]
    Honestly I’ve been very happy with my 6800xt so I expect you will enjoy this one quite well.

    You are in the same boat I am on CPU’s though. Once you have the new card let me know how bad the itch is to get a 5000 series Ryzen CPU. 😉
    [/QUOTE]

    I already have a Threadripper 3960x.

    If I decide I need a CPU upgrade, I’m probably going to go for a drop in next gen Threadripper upgrade.

    I just can’t live without my PCIe lanes!

    I generally play games at 4k Ultra, so having a CPU that can support super high frame rates is of lower importance to me.

  11. I just hope – for the love of all that is holy – that I can turn off the damn Christmas tree lights without mucking around and installing apps and shit.

    I don’t want to have more useless software running on my machine.

  12. [QUOTE=”Zarathustra, post: 41997, member: 203″]
    I just hope – for the love of all that is holy – that I can turn off the **** Christmas tree lights without mucking around and installing apps and ****.

    I don’t want to have more useless software running on my machine.
    [/QUOTE]
    I hate that crap too. Occasionally I do want some extra flair, but it’s so annoying having to install software for such things. It’d sure be nice if everyone could settle on a standard and have it baked into the OS/DX/API.

  13. [QUOTE=”Peter_Brosdahl, post: 41998, member: 87″]
    I hate that crap too. Occasionally I do want some extra flair, but it’s so annoying having to install software for such things. It’d sure be nice if everyone could settle on a standard and have it baked into the OS/DX/API.
    [/QUOTE]

    Ah, but if they did that they couldn’t all have their own little app collecting monetizeable ad data on you, now could they? :p

  14. Huh, surprisingly it is still in stock.

    This is supposed to be a limited edition card in a year when even regular cards are unobtainable or overpriced, yet there it sits, in stock for quite a while now at AIB MSRP, $500+ less than my local Microcenter has been asking for vanilla 6900 XTs…

    Strange.

    Did the GPU bubble just burst or something? :p

  15. [QUOTE=”Endgame, post: 42033, member: 1041″]
    I’m not really a fan of EKWB, but that is kind of tempting.
    [/QUOTE]

    Yeah, I hear what you are saying.

    I think they’ve had their stumbles, but they are mostly OK, and if you want a fullcover block for any random GPU out there you happen to be able to pick up, sometimes they are your only alternative, especially in this market where you are lucky to get a GPU at all, and can’t pick and choose the one you want based on which block will fit.

    I can think of two negatives about them.

    1.) That nickel coating fiasco a decade ago.

    I was really disappointed in that they didn’t take full responsibility for it and make their customers whole, but at least they learned from it and improved their plating process, so newer parts shouldn’t have the same problem.

    2.) The shitty design on their first Threadripper block.

    They assumed Threadripper was going to be a low volume chip no one would buy, so in order to make a cost-effective block for it, they stretched their existing base resulting in a small fin area compared to the large CPU. Turns out their competitors went all in and designed special CPU blocks just for threadripper. When they realized this (and when they got some bad press) they did the same and launched a larger v2 block that performs well.

    I’ve had two of their nickel plated blocks, my old CPU block (Supremacy EVO full copper) and my current GPU block for my Pascal Titan. Neither have had any issues and both have performed very well.

    This particular GPU spoke to me because it was available, it was considered one of the most performant 6900xt’s on the market, and I wouldn’t have to go out trying to find a full cover block that fit it, and have it shipped from slow-venia making me wait for my install. This is going to make it easy to just plop it in there and go!

    When I first started buying water blocks in 2016 (I was late to the party) I struggled with whether or not I wanted to give my money to a company that had screwed over its customers like they had with the nickel plating issue, but I quickly concluded that if I wanted to cool the components I had, (and get the best performance while doing so) I didn’t have much of a choice. I got over it over time.

    The blocks I have bought have worked for me, but when I upgraded to a Threadripper 2 years ago I decided to go with Watercool’s block instead, and it has not disappointed. Still, on the GPU side the options are more limited. Watercool has great blocks, but only for a small number of cards. You buy anything outside of reference layout and they are pretty much out of the picture, and so few boards are reference layout these days. There’s the Alphacool adaptable design, where you keep the same central block, and just replace the mounting hardware and fullcover portion to fit your new GPU. It’s a cool concept, but they don’t perform as well, sadly. Then there is the Chinese junk like Bykski. No way I am trusting water near my expensive parts to a Chinese company.

    …so I keep coming back to EK being my only choice in many cases.

  16. [QUOTE=”Zarathustra, post: 41974, member: 203″]
    Huh, right after I posted this I loaded the Newegg store page again, and what do you know, [URL=’https://www.newegg.com/xfx-radeon-rx-6900-xt-rx-69xtawbd9/p/N82E16814150863′]there it was in stock,[/URL] so I bought one.
    [/QUOTE]
    Still in stock now actually – which is amazing.

    Only thing I can figure is that it’s a full WC card and you’d need a custom loop in order to run it – so it isn’t plug-and-play like nearly every other card is. That would probably keep most miners and some scalpers at bay.

  17. [QUOTE=”Zarathustra, post: 42034, member: 203″]
    Yeah, I hear what you are saying.

    I think they’ve had their stumbles, but they are mostly OK, and if you want a fullcover block for any random GPU out there you happen to be able to pick up, sometimes they are your only alternative, especially in this market where you are lucky to get a GPU at all, and can’t pick and choose the one you want based on which block will fit.

    I can think of two negatives about them.

    1.) That nickel coating fiasco a decade ago.

    I was really disappointed in that they didn’t take full responsibility for it and make their customers whole, but at least they learned from it and improved their plating process, so newer parts shouldn’t have the same problem.

    2.) The ****ty design on their first Threadripper block.

    They assumed Threadripper was going to be a low volume chip no one would buy, so in order to make a cost-effective block for it, they stretched their existing base resulting in a small fin area compared to the large CPU. Turns out their competitors went all in and designed special CPU blocks just for threadripper. When they realized this (and when they got some bad press) they did the same and launched a larger v2 block that performs well.

    I’ve had two of their nickel plated blocks, my old CPU block (Supremacy EVO full copper) and my current GPU block for my Pascal Titan. Neither have had any issues and both have performed very well.

    This particular GPU spoke to me because it was available, it was considered one of the most performant 6900xt’s on the market, and I wouldn’t have to go out trying to find a full cover block that fit it, and have it shipped from slow-venia making me wait for my install. This is going to make it easy to just plop it in there and go!

    When I first started buying water blocks in 2016 (I was late to the party) I struggled with whether or not I wanted to give my money to a company that had screwed over its customers like they had with the nickel plating issue, but I quickly concluded that if I wanted to cool the components I had, (and get the best performance while doing so) I didn’t have much of a choice. I got over it over time.

    The blocks I have bought have worked for me, but when I upgraded to a Threadripper 2 years ago I decided to go with Watercool’s block instead, and it has not disappointed. Still, on the GPU side the options are more limited. Watercool has great blocks, but only for a small number of cards. You buy anything outside of reference layout and they are pretty much out of the picture, and so few boards are reference layout these days. There’s the Alphacool adaptable design, where you keep the same central block, and just replace the mounting hardware and fullcover portion to fit your new GPU. It’s a cool concept, but they don’t perform as well, sadly. Then there is the Chinese junk like Bykski. No way I am trusting water near my expensive parts to a Chinese company.

    …so I keep coming back to EK being my only choice in many cases.
    [/QUOTE]
    I’ve been doing blocks since 2002 – first I used danger den, then swiftech, and now Optimus. So far I’m really enjoying Optimus on my 5950x, but while their blocks are top performers, I can’t find a video card in stock they have a block for.

    I would be more inclined to go with EK if I could get a bare copper block – I don’t really care for nickel in the first place, and then add the whole plating issue when I was first considering using them…

  18. [QUOTE=”Zarathustra, post: 42034, member: 203″]
    Yeah, I hear what you are saying.

    I think they’ve had their stumbles, but they are mostly OK, and if you want a fullcover block for any random GPU out there you happen to be able to pick up, sometimes they are your only alternative, especially in this market where you are lucky to get a GPU at all, and can’t pick and choose the one you want based on which block will fit.

    I can think of two negatives about them.

    1.) That nickel coating fiasco a decade ago.

    I was really disappointed in that they didn’t take full responsibility for it and make their customers whole, but at least they learned from it and improved their plating process, so newer parts shouldn’t have the same problem.

    2.) The ****ty design on their first Threadripper block.

    They assumed Threadripper was going to be a low volume chip no one would buy, so in order to make a cost-effective block for it, they stretched their existing base resulting in a small fin area compared to the large CPU. Turns out their competitors went all in and designed special CPU blocks just for threadripper. When they realized this (and when they got some bad press) they did the same and launched a larger v2 block that performs well.

    I’ve had two of their nickel plated blocks, my old CPU block (Supremacy EVO full copper) and my current GPU block for my Pascal Titan. Neither have had any issues and both have performed very well.

    This particular GPU spoke to me because it was available, it was considered one of the most performant 6900xt’s on the market, and I wouldn’t have to go out trying to find a full cover block that fit it, and have it shipped from slow-venia making me wait for my install. This is going to make it easy to just plop it in there and go!

    When I first started buying water blocks in 2016 (I was late to the party) I struggled with whether or not I wanted to give my money to a company that had screwed over its customers like they had with the nickel plating issue, but I quickly concluded that if I wanted to cool the components I had, (and get the best performance while doing so) I didn’t have much of a choice. I got over it over time.

    The blocks I have bought have worked for me, but when I upgraded to a Threadripper 2 years ago I decided to go with Watercool’s block instead, and it has not disappointed. Still, on the GPU side the options are more limited. Watercool has great blocks, but only for a small number of cards. You buy anything outside of reference layout and they are pretty much out of the picture, and so few boards are reference layout these days. There’s the Alphacool adaptable design, where you keep the same central block, and just replace the mounting hardware and fullcover portion to fit your new GPU. It’s a cool concept, but they don’t perform as well, sadly. Then there is the Chinese junk like Bykski. No way I am trusting water near my expensive parts to a Chinese company.

    …so I keep coming back to EK being my only choice in many cases.
    [/QUOTE]
    Bykski has been my only option for my last 2 cards (Gigabyte 1080Ti, and Asrock 6900XT Phantom Gaming). The RGB controller that came with the 1080Ti block didn’t work – no loss. The backplate for the 6900XT was too far off the PCB to provide any cooling, so I kept the stock backplate when I put the block on.

    The blocks themselves have fit well, worked well, and been trouble-free. I’d rather buy Alphacool, EK, etc. but Bykski gets the job done and makes blocks for the cards other companies won’t. With the current shipping situation, though, I wouldn’t count on getting items from China on schedule.

  19. I’ll never buy another Alphacool GPU block, ever. They have the worst support, and the blocks are not designed to be taken apart and cleaned. The metal shroud covering the front of the block is glued/taped in place. Takes a lot of prying, and bending, to get it off to expose the screws holding the block together. If you damage it, you’re SOL because Alphacool does not sell replacement parts for their blocks. Nor do they recommend their blocks be taken apart at all.

    Also, I run nothing but distilled with Primochill biocide. In all the years, and all the blocks I’ve owned, the Alphacool block is the only one to be affected by this coolant combo. Fins have become discolored and the finish is coming off.

    If you’re down for a block that’s basically disposable, go for it.

  20. [QUOTE=”Riccochet, post: 42043, member: 4″]
    I’ll never buy another Alphacool GPU block, ever. They have the worst support, and the blocks are not designed to be taken apart and cleaned. The metal shroud covering the front of the block is glued/taped in place. Takes a lot of prying, and bending, to get it off to expose the screws holding the block together. If you damage it, you’re SOL because Alphacool does not sell replacement parts for their blocks. Nor do they recommend their blocks be taken apart at all.

    Also, I run nothing but distilled with Primochill biocide. In all the years, and all the blocks I’ve owned, the Alphacool block is the only one to be affected by this coolant combo. Fins have become discolored and the finish is coming off.

    If you’re down for a block that’s basically disposable, go for it.
    [/QUOTE]

    Interesting. I’ve never actually used an Alphacool water block. I’ve used a ton of their radiators though, and apart from them shipping with a shit ton of flux in them requiring an above average amount of cleaning, they are pretty damn good.

  21. [QUOTE=”Zarathustra, post: 42056, member: 203″]
    Interesting. I’ve never actually used an Alphacool water block. I’ve used a ton of their radiators though, and apart from them shipping with a **** ton of flux in them requiring an above average amount of cleaning, they are pretty **** good.
    [/QUOTE]
    For the money Barrow radiators have done me quite well. Also run Barrow fittings as they’re 1/4 the price of Bitspower.

  22. [QUOTE=”Riccochet, post: 42062, member: 4″]
    For the money Barrow radiators have done me quite well. Also run Barrow fittings as they’re 1/4 the price of Bitspower.
    [/QUOTE]

    I still have some discomfort using a Chinese off-brand for anything that has a potential for things going bad if it leaks.

    A couple of years ago my XSPC swivel bends started leaking, and I replaced them all with Bitspower. Not cheap, but they have been perfect and I have more peace of mind. To me that peace of mind is well worth the Bitspower money…

    I’ve intended to switch all of my XSPC compression fittings to Bitspower at some point as well, but I just haven’t gotten around to it. It’s less about the money, and more about just not having time to drain, disassemble, reassemble and fill the damn thing again…

  23. So, this bad boy is out for delivery as we speak.

    I’m going to hook it up to my spare pump and res and give it a thorough rinse and cleaning before dropping it in the loop.

    I have a rehearsal and wedding to go to Thursday/Friday (damn, there are so many of them these days) so I won’t be able to install it until late weekend probably.

    Side note:

    What are we using, software-wise, to overclock AMD GPU’s these days?

    Is the good old MSI Afterburner still the way to go, or is there something better?

    [ATTACH type=”full”]1259[/ATTACH]

  24. Well, how about that.

    I just had a surprisingly large box arrive…

    [ATTACH type=”full”]1260[/ATTACH]

  25. This thing is surprisingly heavy. Much more so than I remember my Pascal Titan X with block installed being….

    (Though I haven’t done a side-by-side comparison, as the old card is staying installed until I’m done cleaning the new one.)

    [ATTACH type=”full”]1261[/ATTACH]

    The back plate is metal and like 1/8″ thick throughout. It’s like the damn thing is armored.

    [ATTACH type=”full”]1272[/ATTACH]
    [ATTACH type=”full”]1273[/ATTACH]

  27. Sadly the RGB LED cable is not detachable from the block, so you are stuck with it whether or not you want it…

    [ATTACH type=”full”]1265[/ATTACH]

    (Unless… you know… But that probably impacts warranty and resale value. I guess I’ll have to try to hide it somehow.)

    I don’t even know what kind of connector this is.

    [ATTACH type=”full” alt=”PXL_20211006_221230595~2.jpg”]1266[/ATTACH]

    I don’t do that Christmas tree lighting BS in my builds…

  28. Here’s hoping the motherboard PCIe slot can handle the weight…

    I cannot overstate how surprisingly heavy this thing is.

    Normally I like that, it makes it feel sturdy and of quality construction, but this may be a little bit much, even for me…

  29. It comes with some sort of weird EK hex wrench. I’m guessing that must be for the G1/4 port caps.

    [ATTACH type=”full” alt=”PXL_20211006_221706104~2.jpg”]1267[/ATTACH]

    Also not quite sure what all these screws are for. In case you take the block/backplate off and lose some?

    Going to have to read the manual…

    [ATTACH]1271[/ATTACH]

  30. Kind of curious why they didn’t make it single slot. It looks as if it would have fit…

    And it’s not like there are any ports in the second slot area or anything.

    [ATTACH type=”full”]1268[/ATTACH]

  31. Anyway. Time for dinner.

    Then I have to make a microcenter run.

    Apparently I don’t have enough fittings in my spare parts bin to hook this up to my spare XSPC pump/res for cleaning without disassembling the entire computer…

    [ATTACH type=”full”]1270[/ATTACH]

    That’s both one huge reservoir and one huge GPU…

    That is a standard RGB cable. That said, wire snips would take care of it if you didn’t want it. There is probably an RGB header on the motherboard to plug it into though.

  33. [QUOTE=”Brian_B, post: 42182, member: 96″]
    That is a standard RGB cable. That said, wire snips would take care of it if you didn’t want it. There is probably an RGB header on the motherboard to plug it into though.
    [/QUOTE]

    You know, it would be nice if it were detachable for those of us who aren’t into the gaudy lighting, but based on the fact that it sits low on the card I can probably tuck it through a hole and hide it behind the motherboard tray.

    I’m a little annoyed that I have to, but rather than risk voiding my warranty, I’ll learn to live with it…

  34. Mine arrives tomorrow. It’ll be another week or so before I get to installing it. Draining my loop is a royal PITA.

  35. [QUOTE=”Riccochet, post: 42184, member: 4″]
    Mine arrives tomorrow. It’ll be another week or so before I get to installing it. Draining my loop is a royal PITA.
    [/QUOTE]

    Tell me about it. That’s why I invested in several koolance QDC’s. This is actually the first time since installing them I am replacing a component. I’m excited!

  36. [QUOTE=”Zarathustra, post: 42186, member: 203″]
    Tell me about it. That’s why I invested in several koolance QDC’s. This is actually the first time since installing them I am replacing a component. I’m excited!
    [/QUOTE]
    Yeah the QD’s are nice if you swap components often. Mine is hard lined. I got lucky going from a 1080Ti with an EK block to a 2080Ti with an Alphacool block and not having to redo any lines.

  37. [QUOTE=”Zarathustra, post: 42177, member: 203″]
    Then I have to make a microcenter run.

    Apparently I don’t have enough fittings in my spare parts bin to hook this up to my spare XSPC pump/res for cleaning without disassembling the entire computer…

    [/QUOTE]

    Huh…

    Turns out Microcenter doesn’t have compression fittings for 3/8 ID 1/2″ OD tubing anymore.

    They have 3/8″ ID 5/8″ OD and 1/2″ ID 3/4″ OD, but not what I needed.

    I was looking at ones that said they were 10mm ID / 12mm OD, but it turns out they were hardtube parts with no inner barb :/

    I was planning on using what I got as spares in the future, but instead I just got 4x barb tubes for the cleaning

  38. So, out of the box having done only the following:

    1.) Updated BIOS to allow for Resizable BAR/SAM
    2.) Disabled SMT (because apparently 3DMark hates SMT on the Threadripper)

    I scored the following:

    [ATTACH type=”full” alt=”timespy_no_SMT.png”]1276[/ATTACH]

    Here I am going to note that a 22k graphics score in Time Spy is higher than anyone has posted in [URL=’https://hardforum.com/threads/6800-6900-overclock-results.2006009/’]the 6800/6900 overclocking thread[/URL] over at the [H], and this is at stock settings. I have changed nothing. All clock speed, voltage, and power settings are at their out-of-box defaults.

    I am also going to note that I am not a fan of canned benchmarks. I don’t play 3DMark, and as such it is not reflective of what I will get in game. What it is, though, is a good reference point for me to know if everything is working right, as there is a literal ton of Time Spy data out there to compare to.

    Either I have done something wrong in my testing, or I have the best 6900 ever in my possession.

    Also worth noting: it does run a little hot. Like, not “air cooler” hot, but much hotter than my old Pascal Titan X under water. For these runs I set the fan profiles to keep the coolant at ~33C. On my Pascal Titan X this resulted in in-game core temps of about 38C. This beast is hovering at about 50C to 51C. Part of this is not unexpected. The Pascal Titan was a 250W TDP card. The 6900 XT stock is 300W, but who knows how high XFX set it out of the box? The specs do not say. I’ll have to upgrade the monitoring software and measure during a run to see what it registers.

    I’m wondering if they did a shitty mounting/pasting job at the XFX factory, and if I should take off the block, and give it some Kryonaut goodness. (Really not looking forward to that…) Appreciate thoughts.

I’d say if you have a potential world record holder of a card there, I would be leery to touch much of anything, but that’s just me.

  40. To follow up, I collected some additional metrics.

    Based on the Radeon software’s measured wattage during a Time Spy run, the out-of-box Power Limit seems to be set to 335W. That’s the max power draw I saw throughout the benchmark run, and it was fairly consistent, rarely dipping below 330W.

    I moved the power slider all the way to the right to add my 15% and re-ran the benchmark. As would be mathematically expected, this resulted in a max power draw of 385W (the quick sanity check at the end of this post shows the arithmetic). It was not as consistent though; it bounced up and down more, which suggests that with the 335W limit it was hitting the limit a lot, whereas at 385W it only hits it some of the time.

    Increasing the power (but touching nothing else) yielded slightly higher numbers.

    [ATTACH type=”full” alt=”timespy_no_SMT_max_power.png”]1277[/ATTACH]

    It also resulted in a 3C increase in core temps to a max of 54C.

    I do want to continue tweaking, but I’m not 100% sure what my next best bet is. Maybe go for a higher core clock, while also trying to reduce voltage a little? And then getting MorePowerTool to override the power limit maybe? I’ll have to think about it.

    Oh, by the way, here are the default settings:

    [ATTACH type=”full”]1278[/ATTACH]
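
    For anyone who wants to sanity-check that slider math, here is a minimal sketch (illustrative arithmetic only, nothing pulled from the Radeon software or any real API) of how the +15% slider maps onto the stock limit:

    ```python
    # Illustrative only: how a +15% power-limit slider maps onto the stock limit.
    STOCK_LIMIT_W = 335  # power limit observed at out-of-box settings

    def adjusted_limit_watts(stock_w: float, slider_pct: float) -> float:
        """Effective power limit after moving the slider by slider_pct percent."""
        return stock_w * (1 + slider_pct / 100)

    print(round(adjusted_limit_watts(STOCK_LIMIT_W, 15)))  # 385, matching the measured peak draw
    ```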

  41. Did some more playing around with settings, but admittedly I don’t really know what I’m doing. Here is the best I came up with:

    Min Freq: 2450
    Max Freq: 2650
    Voltage: 1130 (anything lower would crash the driver or 3DMark or both)
    Ram: 2125
    Power Limit: +15% (which results in 385w)

    Unlike other boards, the RAM slider on this one goes up to 3000. Doesn’t help much though. At 2200 I have stuttering, some mild artifacts, and a SIGNIFICANTLY reduced score. At 2175 the stutter and artifacts are gone, but the score is still very bad. 2150 scores very slightly below 2100. The sweet spot for me seems to be 2125 on the RAM.

    Not sure if the score losses are due to the RAM not performing well when pushed too hard, or if it is taking power away from the core due to the power limit.

    Here is my best score thus far:

    [ATTACH type=”full”]1279[/ATTACH]

    I’m obviously happy with this, but it would be fun to push the graphics score into the 23000’s

    That’s all I have time for tonight. I’m considering playing around with MorePowerTool, but since I’m already at 385w, I’m not sure how much higher it is wise to go. I still have plenty of thermal headroom though. Core temp is 52C in my latest run, with Junction Temp at around 67C.

    Again, I’d appreciate any suggestions.

  42. Alright, I think I’ve reached the limit for this one, at least without going to extreme measures.

    Once I loaded up MorePowerTool I found that the stock power limit was 332W (odd choice), and since I had already tested +15%, that meant I had already tested up to 382W, so I just set 382W as the new base power limit (giving me the option to go +15% over that in the drivers).

    Best results I’ve been able to get thus far are at:

    MinFreq: 2575 MHz
    MaxFreq: 2725 MHz
    Voltage: 1140 mV
    VRAM: 2126 MHz
    Power Limit: +15% (439W)
    [URL=’https://www.3dmark.com/3dm/67184786′][ATTACH type=”full” alt=”clock_2725_Volt_1140_Mem_2126.png”]1280[/ATTACH][/URL]
    [URL=’https://www.3dmark.com/3dm/67184786′]link[/URL]

    Not too shabby if I may say so myself.

    I kept an eye on the power draw and it actually did draw all the way up to 437w at one point! No wonder this room is getting warm.

    I tried going up to a MaxFreq of 2750 but that resulted in crashes. I stepped the voltage all the way up to 1200mv but that didn’t help, and I didn’t really want to go above that.

    So I think I’ve had my fill of canned benchmarks for a while. Now I need to find myself a game to play :p

  43. Hopefully I’ll have time to install mine on Sunday. This looks promising if my card is as good as yours.

  44. Alright, so I finally got around to taking the cooler off this thing. Anyone ready for some nudez?

    I simply disconnected my QDC’s and did this with fluid still in the block. A little non-conventional, but I wasn’t planning on opening the block itself, so I figured it was fine. It did cause a few difficulties in getting a flat surface to work on, but I made it work.

    Also, I don’t usually use antistatic mats and wristbands, but I’ve learned more about ESD over time, and my old argument of “well, in my 30 years of doing this I’ve never killed anything” doesn’t really mean anything. ESD is not all or nothing. You can zap something by touching it just enough to weaken it, and have it fail as a result years later. This is also my most expensive component to date, so I figured better safe than sorry.

    [ATTACH type=”full” alt=”01.jpg”]1297[/ATTACH]

    [ATTACH type=”full” alt=”02.jpg”]1298[/ATTACH]

    So, first we have to take the massive backplate off. Seven screws is all it takes, one of them covered by one of those “Warranty void if removed” stickers.

    [ATTACH type=”full” alt=”03.png”]1299[/ATTACH]

    Nice try XFX, we all know that is illegal by now.

    Side note, no idea why the PCIe contacts are so scratched. I can’t think of anything I might have done that would have done that. I’m guessing it must have happened at the plant?

    Either way, they work just fine, so I am not concerned.

    7 screws removed and some gentle prying with my fingers later, and the backplate is off:

    [ATTACH type=”full” alt=”04.jpg”]1300[/ATTACH]

    [ATTACH type=”full” alt=”05.jpg”]1301[/ATTACH]

    Some pretty decent putty style thermal pads covering all the hot components on the back. I’d call this an A+ on XFX’s part. Doesn’t really get much better than this from what I have seen.

    Now we’ve revealed the ~16 (I think?) remaining screws that need to be removed to get the block off. In order to remember which screw holes were used by the backplate so I didn’t accidentally populate them during reassembly, I highlighted them in the next pic.

    [ATTACH type=”full” alt=”06.jpg”]1302[/ATTACH]

    Now, for the moment we’ve all been waiting for. The full frontal nudez.

    [ATTACH type=”full” alt=”07.jpg”]1303[/ATTACH]

    My two comments are as follows:

    1.) GPU looks well pasted. Maybe a little excess, but not too bad. I probably wasted my time in taking this off, but at least now I know. I’ve seen some horror stories so I had to make sure, especially since it was running a bit warmer than I was used to with my old Pascal Titan X. Maybe that’s just a 6900xt thing, or especially THIS 6900xt. It does pull a lot more power than my old Pascal Titan X did.

    2.) Those VRM’s DO NOT look like they are touching the thermal pads in the grooves on the block. There were no real indentations on the thermal putty/pads except for like two of them in the corner.

    [ATTACH type=”full” alt=”08.jpg”]1304[/ATTACH]

    [ATTACH type=”full” alt=”09.jpg”]1305[/ATTACH]

    [ATTACH type=”full” alt=”10.jpg”]1306[/ATTACH]

    Alright, so I wiped down the GPU area of the block and the GPU itself with isopropyl alcohol, and reapplied Thermal Grizzly Kryonaut using the “rub it on using a nitrile glove” method. Thermal Grizzly Kryonaut is pretty thick when cold, so I heated the tube with a hair dryer before starting to make it easier.

    I also rubbed some thermal grizzly on top of the VRM’s to see if I could make them contact the thermal pads. A little unorthodox I know, but I figured it was better than nothing. (I didn’t have any good thermal pads to replace them with.)

    The result?

    Despite the paste job looking pretty good as it was, the GPU does run a lot cooler now.

    At first I didn’t think it was. My first run through, the temps were as high as they were previously, but I guess the Thermal Grizzly Kryonaut just needed to warm up and spread out a bit, because in the second run and beyond the temps settled down nicely.

    Previously at max overclock in TimeSpy I was hitting 56C core, 71C Junction, now that is down to 46C core, 60C Junction.

    At stock settings it was hitting ~51-52 core (can’t remember junction) so this is cooler even than stock settings.

    I have had one weird side effect though. I’ve had to back off about 50-75MHz on the main clocks to remain stable. I’m not quite sure why. Temps are obviously better, and I don’t think I damaged anything, because when you damage these things they are either dead or they’re not; you don’t just end up with a slightly reduced max overclock.

    My best guess is that AMD’s automagic logic is trying to reduce the voltage due to the lower temps, but is reducing it too much and causing the instability.

    I am going to have to tinker with the clocks and voltages to see if I can get back up to where I was.

  45. One final pic I couldn’t attach to the previous post due to the 10 image limit.

    I’m going to insert it above.

  46. I just bought the last one of these available in the country I live in. Will be interesting to see how it performs (and if I need to install larger radiators) 😛

First impressions: this thing is heavy! I’ve got two older graphics cards with fullcover EK blocks, but this takes it to a new level. For example, my MSI GTX 1080 Ti Seahawk X EK is a featherweight in comparison.

  48. [QUOTE=”Zarathustra, post: 41974, member: 203″]
    I guess now it’s only a matter of time until I find out if it is a 3090 killer. Let me know if you have any benchmark requests…
    [/QUOTE]
    At stock speeds, the answer is going to be a definite “no.” The stock boost clock is listed as 2525MHz on the card in question, according to the Newegg listing. I’ve seen data comparing similar clock speeds against overclocked RTX 3090’s, and all that does is close the gap between the two. Even so, there are still cases where the RTX 3090 cards have a substantial lead over the AMD 6900 XT’s.

    On the other hand, if these can reliably hit 3.0GHz or close to it that might be enough to give them a solid lead until ray tracing comes into play. In that case, the performance on the AMD side will tank as their implementation just isn’t anywhere near as good as NVIDIA’s is.

    In either case, I think you are getting an amazing card at a price that’s not too outlandish when you account for a factory overclock plus a factory waterblock like that. I mean, the base card would normally cost you $1,000. Being an AIB card with better VRM’s and a high factory overclock would more than likely increase the cost by $200. That block would no doubt be $300 from EKWB direct. All said and done, I think it’s priced right.

    If I hadn’t lucked out on my RTX 3090’s, I’d buy this thing without any hesitation.
