AMD Radeon RX 6800 XT

Introduction

The wait is finally over.  After what has seemed like ages, AMD has finally released a series of new GPUs aimed at the high-end gaming enthusiast, with the performance to finally provide an enjoyable 4K gaming experience, something AMD has never offered before. 

For years, NVIDIA has dominated this performance segment while AMD ignored it and focused on the mainstream instead.  This left a void in competition at the very high end, and NVIDIA simply had the fastest video card and the best 4K experience, with the GeForce RTX 2080 Ti reigning since its launch in 2018.  AMD offered no competition whatsoever, and no 4K gaming video card.  Is this truly AMD’s comeback?  Let’s find out.

History

Let’s do a little history before we dive into the new Radeon RX 6000 series video cards based on RDNA2.  First, let’s travel back to AMD’s most recent video card launch in July of 2019 with the AMD Radeon RX 5000 series, based on AMD’s first-generation RDNA architecture.  Something you should know about RDNA1: it shared some leftovers from the previous GCN 5.0 Vega architecture.  It was more of a hybrid architecture, moving away from Vega but not quite delivering truly next-generation features and performance.  It was a good attempt, and a big efficiency upgrade over Vega, but not as big an upgrade as the RDNA2 architecture on which today’s Radeon RX 6000 series video cards are based. 

The Radeon RX 5700 XT, the most expensive video card of the bunch, debuted at $399.  This was as fast and as expensive as it got for AMD with RDNA1.  At that price, it competed with the NVIDIA GeForce RTX 2060 SUPER.  AMD did not offer a 4K-class video card, or anything to rival the GeForce RTX 2080, RTX 2080 SUPER, and RTX 2080 Ti.  Those video cards simply dominated the Radeon RX 5700 XT, as they were meant to provide the absolute fastest performance, and the GeForce RTX 2080 Ti offered the best 4K experience.

If we travel back even further in 2019, to the launch prior to the Radeon RX 5000 series, we come to a unique launch: the Radeon VII in February of 2019.  This video card was unique in many ways, as it was AMD’s first 7nm GPU for consumers, a big, bold step forward.  It also uniquely carried 16GB of HBM2 memory, another bold move.  However, these innovations could not overcome the fact that the GPU was based on the older GCN 5.0 Vega architecture, now two generations old.  That really held gaming performance back, and despite the 16GB of HBM2 memory and the $699 price tag, it was not the card for 4K either. 

In fact, this video card was based on a workstation-class GPU from AMD, and it was clear the DNA of the Radeon VII was tied to that platform.  Though the Radeon VII launched at $699, the same price as the GeForce RTX 2080 and RTX 2080 SUPER, it was not capable of providing an enjoyable 4K experience.  It was nowhere near the RTX 2080 Ti, which had launched before it in September of 2018; it simply could not compete on performance.  It offered a decent 1440p experience, and that was all.  In the end, the video card was short-lived, with AMD discontinuing it only five months after launch. 

From a price perspective, though, the Radeon VII was the last GPU AMD launched at the higher $699 price point.  Now the new Radeon RX 6800 XT will take over that segment, at a slightly lower price in fact.

Radeon RX 6000 Series Announcement

Here we are today, November 18th, 2020.  What may have been a bad year for pretty much everyone may end up being a good year for AMD.  Launching today are the Radeon RX 6800 XT and Radeon RX 6800 video cards.  These new video cards are based on the totally new RDNA2 architecture; gone are the legacy components of GCN and Vega.  RDNA2 brings new hardware features such as Ray Tracing, Infinity Cache, and DirectX 12 Ultimate support. 

Finally, AMD is aiming high again, placing these video cards in a higher performance segment.  The Radeon RX 6800 XT will retail for $649, and the Radeon RX 6800 will retail for $579. 

RDNA 2

Let’s dive into the RDNA2 architecture and see what makes these GPUs tick before we get into the specs of both cards.  The RDNA2 architecture is designed for performance and a big upgrade in performance per watt, or efficiency.  It is also geared for DX12 Ultimate and supports all of its new features.  The design also allows RDNA2 to reach high frequencies, and it integrates a new type of cache.  The manufacturing process is the same TSMC 7nm, but AMD has improved the design to hit a higher frequency target, as well as greater density with more CUs and added features.

One of the most important new features in RDNA2 is a new cache AMD calls Infinity Cache.  This is a 128MB pool of embedded memory inside the GPU that all CUs can directly address.  The goal is to improve performance per watt, reduce latency, and optimize for high internal frequencies.  This is one major feature that separates RDNA2 from previous generations, and we expect to see it in future generations, ever-expanding and further optimized.

The goal of making RDNA2 a highly power-efficient architecture resulted in the creation of AMD Infinity Cache, a cache level that alters the way data is delivered in GPUs.  This global cache allows fast data access and acts as a massive bandwidth amplifier, enabling high bandwidth with superb power efficiency.
A highly optimized on-die cache delivers frame data at much lower energy per bit.  With 128MB of AMD Infinity Cache, AMD claims up to 3.25x the effective bandwidth of a 256-bit GDDR6 bus, and when power is added to the equation, up to 2.4x more effective bandwidth per watt versus 256-bit GDDR6.
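The bandwidth amplification claim can be sanity-checked with some back-of-the-envelope arithmetic.  The sketch below is our own simple model, not AMD's: it assumes 16 Gbps GDDR6 (the memory speed used on these cards) and treats the cache as a bandwidth multiplier of 1/(1 − hit rate), which ignores real-world effects such as the cache's own bandwidth limits.

```python
# Back-of-the-envelope model of Infinity Cache bandwidth amplification.
# Assumptions (ours, not AMD's): 16 Gbps GDDR6, and a simple model where
# effective bandwidth = raw bandwidth / (1 - cache hit rate).

def gddr6_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

raw = gddr6_bandwidth_gb_s(256, 16.0)        # 512 GB/s raw for 256-bit GDDR6
amplification = 3.25                         # AMD's claimed effective multiplier
effective = raw * amplification              # 1664 GB/s effective

# Hit rate implied by the simple 1/(1 - h) model for a 3.25x multiplier.
implied_hit_rate = 1.0 - 1.0 / amplification

print(f"raw = {raw:.0f} GB/s, effective = {effective:.0f} GB/s, "
      f"implied hit rate = {implied_hit_rate:.1%}")
```

Under this toy model, roughly seven in ten memory accesses would need to be served by the 128MB cache to reach the claimed figure, which lines up with AMD's description of the cache holding a large fraction of a frame's working set.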

RDNA2 also includes new hardware: dedicated Ray Accelerators, one per CU.  The Ray Accelerator handles intersections of rays with the BVH and sorting of ray intersection times.  Each Ray Accelerator can process 4 ray/box intersections and 1 ray/triangle intersection per CU per clock.  Traversal of the BVH and shading of ray results are handled by shader code running on the CUs.  The Infinity Cache can hold a high percentage of the BVH working set, reducing intersection latency.  RDNA2 supports DX12 Ultimate: DXR ray tracing, Variable Rate Shading, Mesh Shaders, and Sampler Feedback.
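Those per-clock figures translate into a theoretical peak intersection rate.  The sketch below plugs in the RX 6800 XT's 72 CUs and an assumed ~2.25 GHz boost clock (our assumption; sustained game clocks are lower), so treat the results as upper bounds rather than sustained throughput.

```python
# Theoretical peak ray-intersection throughput for RDNA2 Ray Accelerators.
# 72 CUs and a 2.25 GHz clock are our assumptions for the RX 6800 XT at
# boost; real sustained clocks (and therefore rates) are lower.

BOX_TESTS_PER_CU_PER_CLOCK = 4   # ray/box intersections per CU per clock
TRI_TESTS_PER_CU_PER_CLOCK = 1   # ray/triangle intersections per CU per clock

def peak_intersections_per_sec(cus: int, clock_hz: float,
                               per_cu_per_clock: int) -> float:
    """Peak intersections/second: CUs x clock x tests per CU per clock."""
    return cus * clock_hz * per_cu_per_clock

cus, boost_clock = 72, 2.25e9
peak_box = peak_intersections_per_sec(cus, boost_clock, BOX_TESTS_PER_CU_PER_CLOCK)
peak_tri = peak_intersections_per_sec(cus, boost_clock, TRI_TESTS_PER_CU_PER_CLOCK)

print(f"peak box tests: {peak_box / 1e9:.0f} G/s, "
      f"peak triangle tests: {peak_tri / 1e9:.0f} G/s")
```

The 4:1 ratio of box to triangle tests reflects how BVH traversal works: a ray descends several levels of bounding boxes for every candidate triangle it finally tests.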

The power savings result in the Radeon RX 6800 XT being a 300W TDP video card, and the Radeon RX 6800 a 250W TDP video card.  AMD states that both are capable of 4K gaming.  AMD is also working on a Super Resolution feature similar to NVIDIA’s DLSS, but it is not yet implemented; that will be a future addition.

Smart Access Memory

AMD supports a new feature it calls Smart Access Memory.  It is actually based on a PCI-Express specification feature called Resizable BAR.  The feature can be enabled on a platform basis with PCI-Express 4.0, and possibly even down to PCI-Express 3.0.  Right now, AMD requires an AMD 500 series motherboard, a Zen 3 CPU, and a Radeon RX 6000 series GPU to enable it.  You will need to download a new BIOS for your motherboard and turn the feature on there; by default, it is off.  AMD’s plans call for the option to be enabled by default in future motherboard releases.

AMD recently made an announcement clarifying this feature.  AMD states the feature is not proprietary and can work with other hardware.  NVIDIA also chimed in, stating that the GeForce RTX 30 series can support this PCI-Express feature too and will in the future, with similar performance results.  The uplift AMD reports is in the single-digit percentage range (5-6% on average).  So for now, the feature is disabled by default, requires a specific hardware combination with a new BIOS update, and must be turned on manually in the BIOS.  This may change in the future as the feature is adopted, turned on by default, and supported by both AMD and NVIDIA.

Brent Justice

Brent Justice has been reviewing computer components for 20+ years, educated in the art and method of the computer hardware review he brings experience, knowledge, and hands-on testing with a gamer oriented...

Join the Conversation

62 Comments

  1. Sorry, the wait ain’t over.
    No one in the public domain was able to purchase a GPU.
    The bots won again.
    I guess I’ll wait until Easter……..
  2. I knew I was not going to be able to get my hands on Zen3 or 6800XT this year, but it makes me very excited for the system rebuild next year. It’s been years since an all AMD system, and I’m rather looking forward to it.
  3. Thank you Brent for an awesome review! Someday I’ll be able to get my hands on either this or the 3080 from nVidia.
  4. Seems to me the 16gb didn’t make much of a difference if at all

    AMD should feel comfortable releasing an 8gb RX6800 and go head 2 head with the RTX3070 and still beat it.

    1. Poll on Hardforum…..like 2 people.
      For about 2 minutes the plain 6800 was available.
      I tried to get an XT……they were honestly gone before you could advance to your cart.
      Pathetic launch…….oh well, not life or death.
  5. Speaking of which, I will say the card feels nice in my hands, the texturing of the material is quite nice to feel. It’s very solid and heavier than I thought it would be.

    Is that sarcasm? The card will be sitting in the case. Nobody will be stroking it.

  6. After all the hype, this feels like a let-down.

    Really? how so?

    They are so much better than I anticipated, specially the 6800.

    BTW looking at other reviews, the RTX3700 meets or beats the 2080Ti in most games contrary to FPS data. I know they are not really comparable, just saying.

  7. Really? how so?

    They are so much better than I anticipated, specially the 6800.

    BTW looking at other reviews, the RTX3700 meets or beats the 2080Ti in most games contrary to FPS data. I know they are not really comparable, just saying.

    Here’s how: Nvidia lowered prices this generation, and delivered better than expected improvements, AMD raised prices and is clearly lagging Nvidia, even though they have a better process and more VRAM.

    RTX 3070, you mean. No it does not beat 2080Ti from what most people have posted.

  8. I’m puzzled for the performance on FS, I guess its a driver issue maybe. I expected this was the game where the 16gb of ram would make a difference.
  9. Here’s how: Nvidia lowered prices this generation, and delivered better than expected improvements, AMD raised prices and is clearly lagging Nvidia, even though they have a better process and more VRAM.

    RTX 3070, you mean. No it does not beat 2080Ti from what most people have posted.

    In our testing the GeForce RTX 3070 Founders Edition used less power than a GeForce RTX 2080 SUPER FE, and GeForce RTX 2080 Ti FE. This is important because the performance exceeded the GeForce RTX 2080 SUPER FE and matched the GeForce RTX 2080 Ti FE for the most part.

    Matched would be a better term. Beat, well, in some ways, but match works better.

    Full review found here
    Quote taken from conclusion.

  10. This just means when I am ready to buy there isna slightly better chance I can get a card at the tier I want.
  11. I had set aside about $600 for myself for this winter – it was going to either a GPU or console, but I’m about to take it and just go blow it on H&B and I think I’ll come out way ahead for 2020.
  12. I had set aside about $600 for myself for this winter – it was going to either a GPU or console, but I’m about to take it and just go blow it on H&B and I think I’ll come out way ahead for 2020.

    Be careful… Scalpers are broaching into that territory too!

  13. Yep now when I have the budget for a new card I can go to a shop and ask for a base RTX 3080, OR a 6800xt!! I’m sure ONE of those will be in…. right… uhhhh right?
  14. More than likely, yes. The 3070 only has 8GB of VRAM, while the 2080Ti has 11GB. My guess would be, if VRAM was equal, the frame rates would be relatively equal as well.
  15. Great card if all you play is DiRT 5 🤣.

    Seriously, though. Seems perfectly competitive in rasterization with NVIDIA and the 3080, but ray tracing looks rather poor. Maybe next time, AMD.

    I’m puzzled for the performance on FS, I guess its a driver issue maybe. I expected this was the game where the 16gb of ram would make a difference.

    Flight Simulator is heavily CPU-dependent. It will still use as much VRAM is available, but it really shouldn’t affect FPS to a large degree. The highest I’ve seen the game use in various reviews was about 12GB on a 16GB Radeon VII with the dense terrain setting at 4K.

  16. Great card if all you play is DiRT 5 🤣.

    Seriously, though. Seems perfectly competitive in rasterization with NVIDIA and the 3080, but ray tracing looks rather poor. Maybe next time, AMD.

    Flight Simulator is heavily CPU-dependent. It will still use as much VRAM is available, but it really shouldn’t affect FPS to a large degree. The highest I’ve seen the game use in various reviews was about 12GB on a 16GB Radeon VII with the dense terrain setting at 4K.

    Thing is that the RTX3070 with "only"8GB beats the 6800 by a fair margin and is pretty close to the 6800XT. I was expecting the 6800 series take the lead here by quite a bit. So I’m guessing its a driver issue.

  17. I’d be curious to see how the AMD cards do with WoW’s RT. AMD has it’s Logo all over this with Blizzard.

    Some old friends of mine hijacked me back in to that game for the SL expansion, and I get about a 25% perf hit with RT on "High" at 4K with a RTX 2070. Of course it doesnt matter, it’s WoW and I’m on a 4K60 monitor. Just curious.

  18. Thing is that the RTX3070 with "only"8GB beats the 6800 by a fair margin and is pretty close to the 6800XT. I was expecting the 6800 series take the lead here by quite a bit. So I’m guessing its a driver issue.

    It would be interesting to check if SAM boosts performance in FS significantly

  19. Video of Hangar 21 ray tracing demo is out. Looks like they’re not ray tracing the entire scene.

    Yup, for example, they are using screen space reflections in some scenes and Ray traced in others.

    I assume this will be the strategy for AMD with game devs. Use regular effects like shadows, SSAO, SSR for the most part and raytaced effects en select parts.

  20. Yup, for example, they are using screen space reflections in some scenes and Ray traced in others.

    I assume this will be the strategy for AMD with game devs. Use regular effects like shadows, SSAO, SSR for the most part and raytaced effects en select parts.

    Isn’t that how it’s always been done?

  21. Isn’t that how it’s always been done?

    Ray-tracing has been selectively applied here, as it has to be for anything approaching real-time.

    This just seems even more selective; it’s not necessarily a bad thing and it pretty much has to happen in order for many genres to even consider implementing it. It just means more developer effort, not just on the engine side of things, but heavy involvement on the content side too.

  22. Isn’t that how it’s always been done?

    Well, sort off. but most games for example use either raytracing or screen space refrections, HBAO/SSAO or raytraced ambien occlusion, not both at the same time.

    I don’t think its wrong. Most of the time you don’t even notice the raytraced effects, so its a waste of resources to use them everywhere.

    On a side note I think its pretty impressive how AMD did RT on its first try. Technically speaking it seems slower than Turing but the extra overall performance makes up for it.

  23. Well, sort off. but most games for example use either raytracing or screen space refrections, HBAO/SSAO or raytraced ambien occlusion, not both at the same time.

    I don’t think its wrong. Most of the time you don’t even notice the raytraced effects, so its a waste of resources to use them everywhere.

    On a side note I think its pretty impressive how AMD did RT on its first try. Technically speaking it seems slower than Turing but the extra overall performance makes up for it.

    Ray tracing helps performance for shadows (example of World of Tanks a DX11 game where it is implemented using CPU)

    For illumination, I thing UE5’s hybrid approach is a better fit

    Reflections, I hate them. Already the monitor is glass, now the reflections will make everything more shiny

  24. Ray tracing helps performance for shadows (example of World of Tanks a DX11 game where it is implemented using CPU)

    For illumination, I thing UE5’s hybrid approach is a better fit

    Reflections, I hate them. Already the monitor is glass, now the reflections will make everything more shiny

    Reflections are a huge part of everyday life. Literally everywhere.
    Reflections also are more often than not not shiny.

  25. Kind of disappointed in the overclocking of these cards. Yeah, the clocks can go high as hell, but that doesn’t translate to performance increases.

    Maybe it’s driver related and we’ll see some improvements over the coming months. But, I’m kind of doubtful about that.

  26. Ray tracing helps performance for shadows (example of World of Tanks a DX11 game where it is implemented using CPU)

    For illumination, I thing UE5’s hybrid approach is a better fit

    Reflections, I hate them. Already the monitor is glass, now the reflections will make everything more shiny

    what I hate is making every reflective surface a mirror, even puddles HECK!!! mirror like pavement!!!. Yes I’m looking at you watch dogs legion.

  27. Kind of disappointed in the overclocking of these cards. Yeah, the clocks can go high as hell, but that doesn’t translate to performance increases.

    Maybe it’s driver related and we’ll see some improvements over the coming months. But, I’m kind of doubtful about that.

    I think it’s just Boost is already using all the headroom – it’s already giving you pretty much all it’s got.

  28. what I hate is making every reflective surface a mirror, even puddles HECK!!! mirror like pavement!!!. Yes I’m looking at you watch dogs legion.

    Puddles ARE mirrors. And have you ever seen wet streets in a city at night?

  29. Puddles ARE mirrors. And have you ever seen wet streets in a city at night?

    Yes. A lot of surfaces are relective. Most aren’t super pristine, not even mud puddles – as they usually have ripples or dirt or something else in them.

    The effect should be more like Ambient Occlusion. Yes it reflects lights and images, but not with perfect clarity. Even my wood floors have some reflection, with a matte-type finish. It just isn’t perfectly shiny or mirror-like.

  30. Puddles ARE mirrors. And have you ever seen wet streets in a city at night?

    No they are NOT, water is nowhere near as reflective as a mirror. I think a major mistake most games are making with reflective surfaces are treating them as perfectly flat, polished mirrors. It ends up being detrimental to IQ and doesn’t look realistic at all.

    I’ve said before that Control is IMO the best implementation so far in spite of the water puddles (someone please fire the janitor :LOL: :LOL: )
    From what I’ve seen in the gameplay demos, Cyberpunk will be even better.

  31. That RX 6800 is a interesting card. I honestly could see myself getting one to play with, if I could find it haha.