[Image: GIGABYTE Radeon RX 6500 XT EAGLE 4G video card, front view]

Introduction

Today, AMD is launching its new Radeon RX 6500 XT GPU, a completely new ASIC built from the ground up on TSMC’s new N6 manufacturing node. This is the world’s first consumer 6nm GPU, and while that is surely impressive, what about the rest of the specs? Are they enough to justify the new MSRP, and the pricing realm we find ourselves in today, for what the 6500 XT offers?

First and foremost, the AMD Radeon RX 6500 XT is AMD’s answer to the entry-level to mainstream segment of the video card market. Think of the Radeon RX 6500 XT as an entry-level video card if you are just getting into gaming, or want to upgrade from an older video card like the Radeon RX 570 or GeForce GTX 1650. In fact, you might even think of it as the most cost-effective way to move from integrated graphics to discrete video card performance.

AMD Radeon RX 6500 XT

First, to the official MSRP: AMD has set the “SEP” of this video card at $199. However, what you are actually going to find is that cards from manufacturers and Add-in-Board (AIB) partners are going to be priced much higher than this, before you even consider the wackiness of retail and online pricing. We believe finding this card at $199 will be rare, as the base price set by AIBs is going to be higher from the get-go, including on the card we have for review today. Note also that there is no AMD reference model; the Radeon RX 6500 XT is an AIB-only card.

AMD is positioning this card specifically as an upgrade from a Radeon RX 570 or GeForce GTX 1650. If you look at the slide deck, you will find the performance comparisons are based against those video cards. While that is fine and all, we also feel it is important to compare this video card to the last generation’s Radeon RX 5500 XT to see what has transpired from generation to generation, as the Radeon RX 5500 XT is based on the RDNA1 architecture and the new Radeon RX 6500 XT is based on the new RDNA2 architecture.

As this is an entry-level to mainstream video card, it is targeted at the 1080p gameplay experience. More specifically, it is best suited to playing current games at 1080p at Medium or High settings. This is not a video card for “Ultra,” “Extreme,” or “Maxed Out” settings at 1080p; rather, it’s a card for medium-level gameplay settings at 1080p in most cases.

One of the unique features of the Radeon RX 6500 XT is that it is the first GPU out based on TSMC’s N6 (6nm) process. The GPU is 107mm² with 5.4 billion transistors. This has allowed AMD to hit a very high frequency, its fastest ever by AMD’s own claim, with the game clock on this GPU set at 2.6GHz. That’s fast! You might even imagine hitting 3GHz on overclocks, which is pretty incredible.

Hardware Specs

| Specification | AMD Radeon RX 6500 XT | AMD Radeon RX 5500 XT |
| --- | --- | --- |
| Manufacturing Node | TSMC N6 | TSMC N7 |
| Architecture | RDNA2 | RDNA |
| Compute Units | 16 | 22 |
| Ray Accelerators | 16 | N/A |
| Stream Processors | 1024 | 1408 |
| Infinity Cache | 16MB | N/A |
| Game Clock / Boost Clock | 2610MHz / 2815MHz | 1717MHz / 1845MHz |
| ROPs | 32 | 32 |
| Memory | 4GB GDDR6 | 4GB or 8GB GDDR6 |
| Memory Bus Width | 64-bit | 128-bit |
| Memory Speed (effective) | 18Gbps | 14Gbps |
| Memory Bandwidth | 144GB/s | 224GB/s |
| PCIe Interface | PCIe 4.0 x4 | PCIe 4.0 x8 |
| Media Encoder (VCE) | No | Yes |
| TDP (Board Power) | 107W / 120W | 130W |
| MSRP | $199 | $169 (4GB) / $199 (8GB) |

As we mentioned above, the Radeon RX 6500 XT is based on the new RDNA2 architecture and therefore features Infinity Cache. There is, in fact, 16MB of Infinity Cache on board, and boy does it need it given some of the constraints this video card has been handed. First and foremost, you’ll notice that the memory bus width is only 64-bit. Yes, a 64-bit memory bus feeding this GDDR6 memory.

That’s a very narrow bus, but remember, the Infinity Cache is supposed to make up for it. To further compensate for the narrow memory bus, AMD is running the fastest GDDR6 available at 18Gbps. That provides 144GB/s of memory bandwidth. It’s not a lot, but again, the Infinity Cache is supposed to make up for that. In addition, it has a small framebuffer, only 4GB on the Radeon RX 6500 XT, with no 8GB option like the Radeon RX 5500 XT had.
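
For reference, peak GDDR6 bandwidth is simply the effective data rate multiplied by the bus width. Here is a minimal sketch of that arithmetic (our own back-of-the-envelope math, not AMD’s), which reproduces the figures in the spec table above:

```python
# Peak GDDR6 bandwidth = effective data rate (Gbps per pin) x bus width (bits) / 8.
# A quick sanity check of the spec-table numbers; just arithmetic, nothing vendor-specific.

def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for a given GDDR6 configuration."""
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gb_s(18, 64))   # RX 6500 XT: 18Gbps on 64-bit  -> 144.0 GB/s
print(peak_bandwidth_gb_s(14, 128))  # RX 5500 XT: 14Gbps on 128-bit -> 224.0 GB/s
```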

As for the rest of the GPU’s specifications, it has 16 Compute Units and thus 16 Ray Accelerators for Ray Tracing. It has 1024 Stream Processors, 32 ROPs, and runs at a 2610MHz game clock and 2815MHz boost clock. It has an official TDP, or board power, of 107W, but there also seems to be a secondary 120W potential that some add-in-board partners may exploit. This would mean potentially higher clock speeds on those parts, perhaps for factory-overclocked video cards.

As this is based on the RDNA2 architecture, it also supports Ray Tracing. However, it only has 16 Ray Accelerators, so Ray Tracing is more of a capability to experiment with than something to rely on; in terms of practical performance, games are probably not going to be playable with Ray Tracing enabled unless you really lower in-game graphics settings.

The Constraints

There are a few other constraints on this video card that you should know about. The Radeon RX 6500 XT is built with a PCIe 4.0 x4 signaling interface to save cost. That means it operates over a PCI-Express 4.0 x4 connection, which provides close to 8GB/s (a little less, actually). Since this video card uses x4 signaling, and not x8 like the previous Radeon RX 5500 XT, on any PCI-Express 3.0 platform it will still only operate at x4 signaling, cutting the bandwidth in half, down to less than 4GB/s.

It’s a major constraint that will affect users on PCI-Express 3.0 platforms, of which there are many. You won’t get full bandwidth out of this card unless you are on a PCI-Express 4.0 or 5.0 system, which cuts out a lot of users on older systems, exactly the kind of people who might want a video card upgrade. Again, the Radeon RX 5500 XT had a PCIe 4.0 x8 interface, so on PCIe Gen3 it still operates at x8, double the lanes of the 6500 XT. We’ll talk more about this later.
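
To put rough numbers on that, here is a minimal sketch of the link arithmetic (again our own math, accounting only for the 128b/130b line encoding that both Gen3 and Gen4 use, and ignoring other protocol overhead):

```python
# Approximate one-direction PCIe link bandwidth for Gen3/Gen4 links.
# GT/s x (128/130 encoding efficiency) x lanes / 8 bits-per-byte = GB/s.

def pcie_bandwidth_gb_s(transfer_rate_gt_s: float, lanes: int) -> float:
    """Approximate usable PCIe 3.0/4.0 bandwidth in GB/s, per direction."""
    encoding_efficiency = 128 / 130   # 128b/130b, used by both Gen3 and Gen4
    return transfer_rate_gt_s * encoding_efficiency * lanes / 8

print(pcie_bandwidth_gb_s(16, 4))  # Gen4 x4: ~7.88 GB/s (RX 6500 XT, best case)
print(pcie_bandwidth_gb_s(8, 4))   # Gen3 x4: ~3.94 GB/s (RX 6500 XT on a Gen3 board)
print(pcie_bandwidth_gb_s(8, 8))   # Gen3 x8: ~7.88 GB/s (RX 5500 XT on a Gen3 board)
```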

As if all that wasn’t enough, there is one more piece of information that is important. The Radeon RX 6500 XT has ditched the AMD VCE encoding engine to save cost. While it can decode H.264, H.265, and VP9, it cannot decode AV1, nor can it encode anything in hardware. That’s right, no encoding engine.

There simply is no option to encode using VCE; the card will fall back to CPU-only encoding, with no GPU assist at all. For any content creators who aren’t gamers and are looking for a cheap video card to offload video encoding, the Radeon RX 6500 XT will not have that functionality for you. The weird thing is, the Radeon RX 5500 XT, last generation’s card, had it. That seems like a move backwards to us.
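
In practice, that means any capture or transcode pipeline on this card falls back to a software encoder running on the CPU. A minimal sketch of what that looks like, assuming ffmpeg is installed and using placeholder file names:

```python
# Hypothetical example: with no VCE on the RX 6500 XT, encoding must use a
# CPU-based software encoder such as libx264. File names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "gameplay.mkv",
    "-c:v", "libx264",     # software H.264 encoder; runs entirely on the CPU
    "-preset", "medium",   # speed/quality tradeoff for the CPU encoder
    "-crf", "20",          # constant-quality target
    "output.mp4",
], check=True)

# On a Radeon with a hardware encoder you could hand this off to the GPU
# instead (e.g. ffmpeg's h264_amf encoder on Windows); that path simply
# is not available on this card.
```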

Our Goals In Testing

While there are certainly a lot of constraints here, our goal today is to show you how the video card scales across in-game settings at 1080p. To do this, in our review today we are going to play several games at 1080p and test from the Ultra or Ultimate settings down to the Medium or Low settings, showing all the levels in between. This way you can find out what level of gameplay it might provide for you and what kind of performance you can expect. Are you aiming for 60fps at 1080p? Our gameplay performance testing today should show you where that lies with this card.


Brent Justice

Brent Justice has been reviewing computer components for 20+ years. Educated in the art and method of the computer hardware review, he brings experience, knowledge, and hands-on testing with a gamer-oriented...

Join the Conversation

29 Comments

  1. Nice that you got a review sample – that bodes well for availability.

    It may have drawbacks, but at this point I’d say anything that can stay on the shelves for near MSRP is going to be a huge improvement.

  2. More like 150% of MSRP for the out of stock listings I’ve seen – but that’s still an improvement! Somehow!

    I also appreciate the testing of the transcoding… err, ‘limited decoding’ block. I generally disapprove of Intel’s IGP-less ‘F’ SKUs because they cut off Quicksync along with the IGP, as well as AMD’s release of mostly GPU-less desktop parts.

    I’m not about to spit on working, [I]available[/I] GPUs, but I think AMD cut just a few too many corners here – and they still didn’t get it down to the sub-75W level of power draw needed for powering cards directly through the PCIe slot.

  3. Wow, that was a little worse than I expected, especially considering it isn’t selling for $199 hardly anywhere.

  4. [QUOTE=”LazyGamer, post: 47171, member: 1367″]
    I’m not about to spit on working, [I]available[/I] GPUs, but I think AMD cut just a few too many corners here – and they still didn’t get it down to the sub-75W level of power draw needed for powering cards directly through the PCIe slot.
    [/QUOTE]

    [QUOTE=”Dogsofjune, post: 47182, member: 168″]
    Wow, that was a little worse than I expected, especially considering it isn’t selling for $199 hardly anywhere.
    [/QUOTE]

    Yeah, I think in “Days of Yore” this would have been the bottom tier GPU and gone in the $100-$120 budget range. But today – yeah, anything is better than nothing and the only real competition is whatever you can find scalped on Ebay.

  5. It is nice to see that you tested using FSR where the option was available. This is very useful, keeping in mind it is similar to the soon-to-be-released RSR.

    Do you plan on separate testing for PCIe 3.0 vs PCIe 4.0 (the idea being to see how the playable settings get reduced on PCIe 3.0 vs PCIe 4.0)?

  6. [QUOTE=”Marees, post: 47229, member: 1536″]
    Do you plan on separate testing for PCIe 3.0 vs PCIe 4.0 (the idea being to see how the playable settings get reduced on PCIe 3.0 vs PCIe 4.0)?
    [/QUOTE]
    There were 3DMark comparisons on page 4; they go through a couple of different PCIe scenarios, but only look at changes in 3DMark. So there is some data.

    I’ll give you – that isn’t the same thing as gameplay. Might be neat to see how a couple of titles fare, to see if there are major changes outside of a canned benchmark designed to exacerbate differences; I don’t know if it warrants going through the entire suite of tests again though.

  7. [QUOTE=”Brian_B, post: 47231, member: 96″]
    There were 3DMark comparisons on page 4; they go through a couple of different PCIe scenarios, but only look at changes in 3DMark. So there is some data.

    I’ll give you – that isn’t the same thing as gameplay. Might be neat to see how a couple of titles fare, to see if there are major changes outside of a canned benchmark designed to exacerbate differences; I don’t know if it warrants going through the entire suite of tests again though.
    [/QUOTE]
    Agreed… I think we can draw a relative conclusion from Brent’s data as to how gaming will be. Don’t expect to have the settings tuned up too high.

  8. Gamers Nexus was also similarly displeased. In a normal world, this would be a $100-140 card. 1080p gaming for the masses. This thing selling at retail for $250-300 should be a war crime.

  9. The whole combination is very disappointing overall. At $200 MSRP, if it’s even readily available for that price, needing 1080p medium-to-low settings to be usable in current-generation games, this does not look like a card that will stand the test of time. For me, this is the worst GPU of this generation, and it makes even Nvidia look good, which I thought would be impossible for AMD to do (I was wrong). I do like the review overall; well done exposing the usability of this card with lower settings, FSR options, and ReBAR, which is readily available on AMD and Intel systems. On eBay one can buy a used AMD Fury for less than $150 and get better performance, albeit at higher power and heat, unless you’re lucky and nab a Nano for less than $150, which is about a 175W card.

  10. It’s pretty awful as far as I am concerned. It’s a card that’s basically only about as fast as a $200 card from five years ago and gimped in a way that will make it age far worse. It’s already being scalped for $500+ on eBay. Meanwhile, on eBay there are cheaper alternatives on the used market that will provide a vastly superior gaming experience. Even at $200 I think it’s a non-starter given it’s essentially a waste of engineering time and effort that does nothing to advance the already stagnant segment.

    Frankly, I see absolutely no reason for this GPU to exist in its current form.

    That’s my $0.02 anyway.

  11. This I find amusing.

    [URL unfurl=”true”]https://wccftech.com/amd-removes-4gb-vram-is-not-enough-for-games-marketing-hours-before-radeon-rx-6500-xt-4-gb-launch/[/URL]

  12. [QUOTE=”Dogsofjune, post: 47272, member: 168″]
    This I find amusing.

    [URL unfurl=”true”]https://wccftech.com/amd-removes-4gb-vram-is-not-enough-for-games-marketing-hours-before-radeon-rx-6500-xt-4-gb-launch/[/URL]
    [/QUOTE]
    I think this is a no-win situation for AMD really.

    Had they put more than 4G of VRAM on here, it would just be mining fodder – having only 4G is the only chance it had of staying remotely available. It’s not like the “hash limiter” nVidia has out there does anything to deter miners or make the cards any more available.

    I rail against nVidia for their hash limiter being just a paper shield, and for putting out old video cards again. That doesn’t really fix anything. And I’ll say the same thing here – as Dan_D says, this isn’t any better than just re-releasing an RX 580, cutting the VRAM capacity, but leaving the price the same. But availability would be a step in the right direction.

    I don’t know what the right answer is to be truthful. But I’ve seen an awful lot of what I’m pretty sure are the wrong answers.

  13. Ethereum is memory bandwidth sensitive; the 6500 XT’s measly 64-bit memory bus probably makes it worthless for mining even with 8GB of onboard memory. The 6500 XT, I would say, has no reason to exist; it does everything poorly.

  14. [QUOTE=”Dan_D, post: 47270, member: 6″]
    Frankly, I see absolutely no reason for this GPU to exist in its current form.

    [/QUOTE]
    The reason is easy. $$$$. Nvidia did almost the same thing with the RTX 3050; it’s a gimp and not worth the cash either.

  15. So… if you consider the current street prices (and forget history) for all currently produced GPUs, the 6500 XT family is priced for its performance right where it should be, assuming that street price ends up in the $200-300 range.

    Going by a browsing of sold eBay items, RTX 3060s are selling for $700-800, RX 6600 XTs are selling for about $600, and 6600 non-XTs are selling for $600. Of course, the RTX 3050 is TBD with respect to street pricing. Therefore, in theory, if the street price even settles at $300-350, you’re still paying HALF what you would for the next step up, which gives you a choice: Am I OK with low-medium 1080p settings in current AAA titles and high settings for older/less graphics-intensive games? Or should I spend twice as much for a better 1080p experience?

    If you go by historic prices and market placement, this is a low $100 MSRP card (and I could go on and on about where all of these cards _should_ be priced in the current reality). Unfortunately, this is the reality we’ve been dealing with for a couple of years and I don’t see that changing anytime soon unless maybe the Eth price keeps cratering and stays cratered like it has been this week.

  16. [QUOTE=”David_Schroth, post: 47323, member: 1″]
    So… if you consider the current street prices
    [/QUOTE]
    This is the biggest party-pooping post I’ve seen in a long time. It’s also 100% true, which makes it all the more depressing.

    This post makes me want to go start drinking at 9am.

  17. And they are nowhere to be found, fail of all fails. The cheapest, most stripped-down card ever produced for this generation, and AMD and its AIBs cannot make them in sufficient numbers for the most desperate of desperate PC gamers, strapped for some kind of solution. AMD cannot manufacture, or manage to get manufactured, enough of their products -> pretty big red flag. I hope Intel with their actual manufacturing capability can come through. A few more years of this and I see the killing of PC gaming, PCVR, etc., and a shift to services, as in computing/gaming just moving to cloud-based services and specialty hardware like game consoles and more capable TVs, making PCs much less relevant. Where your processing needs are fully fulfilled and controlled by big tech companies, as long as you go along with the party line and don’t get kicked off.

  18. [QUOTE=”noko, post: 47328, member: 69″]
    I hope Intel with their actual manufacturing capability can come through.
    [/QUOTE]
    They’re stealing it from AMD at TSMC :ROFLMAO:

    Still, Intel is probably the only party that can do anything about prices – assuming they bring real availability, they’re probably going to want to buy market share by sacrificing margins for their first few ‘enthusiast’ generation releases, and that’s going to put downward pressure on Nvidia and AMD.

    Supposing Intel does that, I don’t expect them to continue forever as they do very much like their margins, but they may very well help reset prices somewhat.

  19. [QUOTE=”LazyGamer, post: 47333, member: 1367″]
    They’re stealing it from AMD at TSMC :ROFLMAO:

    Still, Intel is probably the only party that can do anything about prices – assuming they bring real availability, they’re probably going to want to buy market share by sacrificing margins for their first few ‘enthusiast’ generation releases, and that’s going to put downward pressure on Nvidia and AMD.

    Supposing Intel does that, I don’t expect them to continue forever as they do very much like their margins, but they may very well help reset prices somewhat.
    [/QUOTE]
    lol, yep, another TSMC product waiting in line. I would expect Intel to concentrate on OEM builds for ARC first while retail gets the scraps. They have a number of upcoming versions; whether any will be on an Intel process besides the iGPU versions, I do not know. I would hope Intel can take control of their own manufacturing for these; maybe TSMC is just that much better process-wise to give Intel a chance to compete in GPUs.

  20. [QUOTE=”noko, post: 47334, member: 69″]
    lol, yep, another TSMC product waiting in line. I would expect Intel to concentrate on OEM builds for ARC first while retail gets the scraps.
    [/QUOTE]
    I read this sentence with a touch of sadness – I don’t want it to be true, but I have to admit that it comes across as entirely plausible.
    [QUOTE=”noko, post: 47334, member: 69″]
    They have a number of upcoming versions; whether any will be on an Intel process besides the iGPU versions, I do not know. I would hope Intel can take control of their own manufacturing for these; maybe TSMC is just that much better process-wise to give Intel a chance to compete in GPUs.
    [/QUOTE]
    It’s not that Intel can’t make the discrete GPUs that they’re farming out to TSMC, but rather that, while Intel is upgrading their own fabs to catch back up in CPU production, TSMC has the better fabrication technology for Intel’s GPUs. It probably helps that TSMC has had decades of experience fabbing ATi / Nvidia / AMD GPUs as well.

    Whether Intel will dedicate fab capacity in the future toward producing their own discrete GPUs is anyone’s guess. I’d think that they’d want to, but they may not know themselves whether that will be more cost efficient than using TSMC or other third-party fabs.

  21. [QUOTE=”LazyGamer, post: 47333, member: 1367″]
    They’re stealing it from AMD at TSMC :ROFLMAO:

    Still, Intel is probably the only party that can do anything about prices – assuming they bring real availability, they’re probably going to want to buy market share by sacrificing margins for their first few ‘enthusiast’ generation releases, and that’s going to put downward pressure on Nvidia and AMD.

    Supposing Intel does that, I don’t expect them to continue forever as they do very much like their margins, but they may very well help reset prices somewhat.
    [/QUOTE]
    I don’t think Intel will be our white knight here – so far they have signaled their GPUs will be open for business for mining. Which means just more blood for the blood god.

  22. [QUOTE=”Brian_B, post: 47340, member: 96″]
    I don’t think Intel will be our white knight here – so far they have signaled their GPUs will be open for business for mining. Which means just more blood for the blood god.
    [/QUOTE]
    Maybe Intel can out-blood AMD and Nvidia. AMD with TSMC and Nvidia with Samsung are, at this point, not even coming close to satisfying demand, and are pissing many folks off. Still, I’m doubtful TSMC will do much better with Intel.

  23. [QUOTE=”noko, post: 47347, member: 69″]
    Maybe Intel can out-blood AMD and Nvidia. AMD with TSMC and Nvidia with Samsung are, at this point, not even coming close to satisfying demand, and are pissing many folks off. Still, I’m doubtful TSMC will do much better with Intel.
    [/QUOTE]
    Nvidia’s back to TSMC for the RTX 4000-series too.

    I’m wondering when Samsung will pick up another run. They are fabbing AMD GPUs into their new Exynos SoCs, so maybe AMD is willing to take a chance on them for some parts?

  24. Intel is very serious, except in this case 2025 does not help much right now.

    [URL unfurl=”true”]https://time.com/6140476/intel-building-factory-ohio/[/URL]

  25. [QUOTE=”LazyGamer, post: 47348, member: 1367″]
    Nvidia’s back to TSMC for the RTX 4000-series too.

    I’m wondering when Samsung will pick up another run. They are fabbing AMD GPUs into their new Exynos SoCs, so maybe AMD is willing to take a chance on them for some parts?
    [/QUOTE]
    AMD needs to do something to grow; increasing prices without dramatically increasing the number of chips, hitting a wall so to speak, will not go well over time. So yes, Samsung, I would think, could/should be used for some SKUs to increase supply.

  26. Well, I checked the local MC….. These nasty-assed 6500s are in stock for $199. Ugh….
    [URL unfurl=”true”]https://www.microcenter.com/product/646514/asrock-amd-radeon-rx-6500-xt-phantom-gaming-d-overclocked-dual-fan-4gb-gddr6-pcie-40-graphics-card?sku=370973&utm_source=20220121_A_eNews_Computer_Parts_R6448&utm_medium=email&utm_campaign=R6448&MccGuid=C1BA145D-CE5D-49F6-950F-AF13D4AA7FFF[/URL]

  27. [QUOTE=”David_Schroth, post: 47323, member: 1″]
    So… if you consider the current street prices (and forget history) for all currently produced GPUs, the 6500 XT family is priced for its performance right where it should be, assuming that street price ends up in the $200-300 range.

    Going by a browsing of sold eBay items, RTX 3060s are selling for $700-800, RX 6600 XTs are selling for about $600, and 6600 non-XTs are selling for $600. Of course, the RTX 3050 is TBD with respect to street pricing. Therefore, in theory, if the street price even settles at $300-350, you’re still paying HALF what you would for the next step up, which gives you a choice: Am I OK with low-medium 1080p settings in current AAA titles and high settings for older/less graphics-intensive games? Or should I spend twice as much for a better 1080p experience?

    If you go by historic prices and market placement, this is a low $100 MSRP card (and I could go on and on about where all of these cards _should_ be priced in the current reality). Unfortunately, this is the reality we’ve been dealing with for a couple of years and I don’t see that changing anytime soon unless maybe the Eth price keeps cratering and stays cratered like it has been this week.
    [/QUOTE]
    Prices aside for the moment, just based on the hardware specifications, the engineering, and the market this GPU is aimed at, it’s wholly under-engineered, with too many cuts even for a dedicated desktop PC gaming video card.

    This is backed up by the fact that we now know this was intended as a laptop GPU, designed to go alongside an APU that would already have VCE capabilities, and thus to require only limited output connections and limited PCIe bandwidth; it was meant to pair with the Rembrandt 6000 series APUs.

    They basically pushed a laptop GPU to the extreme, wrapped it up in a desktop dedicated GPU package, and in that regard it is completely behind the curve for such a role. It is a step backward, with older GPUs offering features it lacks.

    This further demonstrates to me the fall of low-end GPUs on the desktop and backs up the idea that this market is fading for new GPU IPs.
