Gameplay Performance Continued

Godfall

GIGABYTE Radeon RX 6500 XT EAGLE 4G Godfall

We are using Godfall above with the built-in quality presets. We have also included performance with AMD FidelityFX Super Resolution (FSR) enabled at each setting to compare how much FSR helps performance. Ray tracing is disabled.
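As a quick point of reference for where those FSR gains come from: FSR 1.0 renders internally at a fixed fraction of the output resolution and then upscales. Below is a minimal sketch (in Python, not part of our test suite) using the per-axis scale factors AMD published for FSR 1.0; actual per-game behavior can vary.

```python
# Minimal sketch: internal render resolutions for FSR 1.0 at a 1920x1080 output.
# Scale factors are the per-axis values AMD published for FSR 1.0;
# game-side implementations can vary by title.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution FSR renders at internally before upscaling to the output."""
    scale = FSR_SCALE[mode]
    return round(out_w / scale), round(out_h / scale)

for mode in FSR_SCALE:
    w, h = fsr_render_resolution(1920, 1080, mode)
    pixel_share = (w * h) / (1920 * 1080)
    print(f"{mode:>13}: {w}x{h} ({pixel_share:.0%} of output pixels)")
```

Quality mode at 1080p, for example, renders at roughly 1280x720, only about 44% of the output pixels, which is where most of FSR's framerate gains in these charts come from.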

Starting off at Epic quality, this game is not playable at 1080p on any graphics card here. It was especially choppy on the Radeon RX 6500 XT: despite the average framerate, the benchmark run started off in the teens of FPS, which was pretty bad. Adding FSR did not resolve the issue. Moving down to High settings helped a lot and made the game more tolerable, but ultimately the RX 6500 XT performed no faster than the RX 5500 XT at High settings, and even with FSR turned on, performance was still about the same.

Only at the Medium setting did the Radeon RX 6500 XT stand apart from the Radeon RX 5500 XT, coming in about 7% faster; it was also 42% faster than the GTX 1650. Medium settings are definitely the way to go in this game on the Radeon RX 6500 XT. While FSR does help at High, we still experienced some stuttering on the RX 6500 XT until we moved down to Medium settings.

Horizon Zero Dawn

GIGABYTE Radeon RX 6500 XT EAGLE 4G Horizon Zero Dawn

Horizon Zero Dawn now supports FSR as well, so we utilize it here to see how much it improves performance. At Ultimate, the 62 FPS average looks good on paper, but there was still some hitching despite that healthy number. Once we moved down to Favor Quality or Original, the game was super smooth. Overall, once again, the GIGABYTE Radeon RX 6500 XT EAGLE doesn’t offer much in the way of a performance improvement over the Radeon RX 5500 XT. It’s ever so slightly faster, but just barely, and the FSR performance is exactly the same. It is much faster than the GeForce GTX 1650, so there are big gains there, but compared to last gen’s RX 5500 XT, no advantage.

Microsoft Flight Simulator 2020 Game of the Year Edition

GIGABYTE Radeon RX 6500 XT EAGLE 4G Microsoft Flight Simulator 2020 Game of the Year Edition

Lastly, we are using the newest Microsoft Flight Simulator 2020 Game of the Year Edition, with the game’s new DX12 API mode enabled. This game is severely bottlenecked at Ultra settings at 1080p; all of these video cards are equally constrained at that preset. Only when we move down one notch to the High-End quality preset do we see some differences. The GTX 1650 is not playable, but the RX 5500 XT, and even more so the RX 6500 XT, becomes playable. This game is playable at lower framerates, and it’s clear the new Radeon RX 6500 XT has an advantage here.

To get higher framerates you’ll want to move down to Medium settings. You can hit nearly 60 FPS average on the new GIGABYTE Radeon RX 6500 XT EAGLE 4G, whereas the older Radeon RX 5500 XT could only manage around 49 FPS average. It gets even better at the Low-End preset, where you can hit some really high framerates on the Radeon RX 6500 XT. However, the Low-End preset in this game looks like crap, so we recommend not going below Medium. At Medium settings, the newer Radeon RX 6500 XT is definitely going to provide a better experience.
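For reference, the uplift figures quoted in this review are simply the ratio of the two cards’ average framerates. A minimal sketch using the Medium-preset averages above:

```python
# Minimal sketch: percent uplift between two average framerates,
# using the Medium-preset averages quoted above (~60 FPS vs ~49 FPS).
def percent_faster(fps_new: float, fps_old: float) -> float:
    """How much faster the new card is, relative to the old card."""
    return (fps_new / fps_old - 1.0) * 100.0

print(f"RX 6500 XT vs RX 5500 XT: {percent_faster(60, 49):.0f}% faster")  # ~22%
```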



Join the Conversation


  1. Nice that you got a review sample – that bodes well for availability.

    It may have drawbacks, but at this point I’d say anything that can stay on the shelves for near MSRP is going to be a huge improvement.

  2. More like 150% of MSRP for the out of stock listings I’ve seen – but that’s still an improvement! Somehow!

    I also appreciate the testing of the transcoding… err, ‘limited decoding’ block. I generally disapprove of Intel’s IGP-less ‘F’ SKUs because they cut off Quick Sync along with the IGP, as well as AMD’s release of mostly GPU-less desktop parts.

    I’m not about to spit on working, [I]available[/I] GPUs, but I think AMD cut just a few too many corners here – and they still didn’t get it down to the sub-75W level of power draw needed for powering cards directly through the PCIe slot.

  3. Wow, that was a little worse than I expected, especially considering it’s hardly selling for $199 anywhere.

  4. [QUOTE=”LazyGamer, post: 47171, member: 1367″]
    I’m not about to spit on working, [I]available[/I] GPUs, but I think AMD cut just a few too many corners here – and they still didn’t get it down to the sub-75W level of power draw needed for powering cards directly through the PCIe slot.
    [/QUOTE]

    [QUOTE=”Dogsofjune, post: 47182, member: 168″]
    Wow, that was a little worse than I expected, especially considering it’s hardly selling for $199 anywhere.
    [/QUOTE]

    Yeah, I think in “Days of Yore” this would have been the bottom tier GPU and gone in the $100-$120 budget range. But today – yeah, anything is better than nothing and the only real competition is whatever you can find scalped on Ebay.

  5. It is nice to see that you tested using FSR where the option was available. This is very useful, keeping in mind it is similar to the soon-to-be-released RSR.

    Do you plan on separate testing for PCIe 3.0 vs PCIe 4.0 (the idea is to see how the playable settings get reduced on PCIe 3.0 vs PCIe 4.0)?

  6. [QUOTE=”Marees, post: 47229, member: 1536″]
    Do you plan on separate testing for PCIe 3.0 vs PCIe 4.0 (the idea is to see how the playable settings get reduced on PCIe 3.0 vs PCIe 4.0)?
    [/QUOTE]
    There were the 3DMark comparisons on page 4; they go through a couple of different PCIe scenarios, but only look at changes in 3DMark. So there is some data.

    I’ll give you that – it isn’t the same thing as gameplay. Might be neat to see how a couple of titles fare, to see if there are major changes outside of a canned benchmark designed to exacerbate differences; I don’t know if it warrants going through the entire suite of tests again, though.

  7. [QUOTE=”Brian_B, post: 47231, member: 96″]
    There were the 3DMark comparisons on page 4; they go through a couple of different PCIe scenarios, but only look at changes in 3DMark. So there is some data.

    I’ll give you that – it isn’t the same thing as gameplay. Might be neat to see how a couple of titles fare, to see if there are major changes outside of a canned benchmark designed to exacerbate differences; I don’t know if it warrants going through the entire suite of tests again, though.
    [/QUOTE]
    Agreed… I think we can draw a relative conclusion from Brent’s data as to how gaming will be. Don’t expect to have the settings turned up too high.

  8. Gamers Nexus was also similarly displeased. In a normal world, this would be a $100-140 card. 1080p gaming for the masses. This thing selling at retail for $250-300 should be a war crime.

  9. Very disappointing overall on the whole combination. At $200 MSRP, if it’s even readily available for that price, with 1080p medium-to-low settings needed to make current-generation games usable, this does not look like a card that will stand the test of time. For me, this is the worst GPU of this generation, and it makes even Nvidia look good, which I thought would be impossible for AMD to do (I was wrong). I do like the review overall; well done exposing the usability of this card with lower settings, FSR options, and ReBAR, which is readily available on AMD and Intel systems. On eBay one can buy used AMD Furys for less than $150 and have better performance, albeit at a higher power level and heat, unless you’re lucky and nab a Nano for less than $150, which are about 175W cards.

  10. It’s pretty awful as far as I am concerned. It’s a card that’s basically only about as fast as a $200 card from five years ago and gimped in a way that will make it age far worse. It’s already being scalped for $500+ on eBay. Meanwhile, on eBay there are cheaper alternatives on the used market that will provide a vastly superior gaming experience. Even at $200 I think it’s a non-starter given it’s essentially a waste of engineering time and effort that does nothing to advance the already stagnant segment.

    Frankly, I see absolutely no reason for this GPU to exist in its current form.

    That’s my $0.02 anyway.

  11. This I find amusing.

    [URL unfurl=”true”]https://wccftech.com/amd-removes-4gb-vram-is-not-enough-for-games-marketing-hours-before-radeon-rx-6500-xt-4-gb-launch/[/URL]

  12. [QUOTE=”Dogsofjune, post: 47272, member: 168″]
    This I find amusing.

    [URL unfurl=”true”]https://wccftech.com/amd-removes-4gb-vram-is-not-enough-for-games-marketing-hours-before-radeon-rx-6500-xt-4-gb-launch/[/URL]
    [/QUOTE]
    I think this is a no-win situation for AMD really.

    Had they put more than 4G of VRAM on here, it would just be mining fodder – having only 4G is the only chance it had of staying remotely available. It’s not like the “hash limiter” nVidia has out there does anything to deter miners or make the cards any more available.

    I rail against nVidia for their hash limiter being just a paper shield, and for putting out old video cards again. That doesn’t really fix anything. And I’ll say the same thing here – as Dan_D says, this isn’t any better than just re-releasing an RX 580, cutting the VRAM capacity but leaving the price the same. But availability would be a step in the right direction.

    I don’t know what the right answer is to be truthful. But I’ve seen an awful lot of what I’m pretty sure are the wrong answers.

  13. Ethereum mining is memory-bandwidth sensitive; the 6500 XT’s measly 64-bit memory bus probably makes it worthless for mining even with 8GB of onboard memory. The 6500 XT, I would say, has no reason to exist – it does everything poorly.
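
    For context, peak bandwidth follows directly from bus width and effective data rate. A back-of-the-envelope sketch in Python, assuming the commonly cited 18 Gbps GDDR6 on the 6500 XT and the stock specs of the comparison cards:

    ```python
    # Back-of-the-envelope: peak bandwidth = (bus width in bits / 8) * Gbps per pin.
    # The 18 Gbps GDDR6 figure for the RX 6500 XT is the commonly cited spec.
    def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
        """Peak theoretical memory bandwidth in GB/s."""
        return bus_width_bits / 8 * gbps_per_pin

    print(peak_bandwidth_gbs(64, 18.0))   # RX 6500 XT: 144 GB/s
    print(peak_bandwidth_gbs(128, 14.0))  # RX 5500 XT: 224 GB/s
    print(peak_bandwidth_gbs(256, 8.0))   # RX 580:     256 GB/s
    ```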

  14. [QUOTE=”Dan_D, post: 47270, member: 6″]
    Frankly, I see absolutely no reason for this GPU to exist in its current form.

    [/QUOTE]
    The reason is easy: $$$$. Nvidia did almost the same thing with the RTX 3050; it’s gimped and not worth the cash either.

  15. So… if you consider the current street prices (and forget history) for all currently produced GPUs, the 6500 XT family is priced for its performance right where it should be, assuming that street price ends up in the $200-300 range.

    Going by a browsing of sold eBay items, RTX 3060s are selling for $700-800, RX 6600 XTs are selling for about $600, and 6600 non-XTs are selling for $600. Of course, the RTX 3050 is TBD with respect to street pricing. Therefore, in theory, if the street price even settles at $300-350, you’re still paying HALF what you would for the next step up, which gives you a choice – am I OK with low-medium 1080p settings in current AAA titles and high settings for older/less graphics-intensive games? Or should I spend twice as much for a better 1080p experience?

    If you go by historic prices and market placement, this is a low $100 MSRP card (and I could go on and on about where all of these cards _should_ be priced in the current reality). Unfortunately, this is the reality we’ve been dealing with for a couple of years and I don’t see that changing anytime soon unless maybe the Eth price keeps cratering and stays cratered like it has been this week.
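
    To put that choice in rough numbers, here is a quick dollars-per-frame sketch; the street prices are the ones above, but the framerates are purely hypothetical placeholders, not benchmark results:

    ```python
    # Rough value sketch. Prices are the street prices discussed above;
    # the framerates are hypothetical placeholders, NOT benchmark results.
    def dollars_per_frame(price: float, avg_fps: float) -> float:
        return price / avg_fps

    cards = {
        "RX 6500 XT (street ~$325)": (325, 50),   # hypothetical 1080p avg FPS
        "RTX 3060 (street ~$750)":   (750, 100),  # hypothetical 1080p avg FPS
    }
    for name, (price, fps) in cards.items():
        print(f"{name}: ${dollars_per_frame(price, fps):.2f}/frame")
    # Even if the dearer card doubles the framerate, the $/frame stays in the
    # same ballpark; the real question is whether the cheaper card clears
    # your playability bar at all.
    ```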

  16. [QUOTE=”David_Schroth, post: 47323, member: 1″]
    So… if you consider the current street prices
    [/QUOTE]
    This is the biggest party pooping post I’ve seen in a long time. It’s also 100% true, which makes it all the more depressing.

    This post makes me want to go start drinking at 9am.

  17. And they are nowhere to be found, fail of all fails. The cheapest, easiest, most stripped-down card ever produced this generation, and AMD and its AIBs cannot make them in sufficient numbers for the most desperate of desperate PC gamers, strapped for some kind of solution. AMD cannot manufacture, or manage to get manufactured, enough of their products -> pretty big red flag. I hope Intel with their actual manufacturing capability can come through. A few more years of this and I see the killing of PC gaming, PCVR, etc. A shift to services, as in computing/gaming just moving to cloud-based services, with specialty hardware like game consoles and more capable TVs making PCs much less relevant. Where your processing needs are fully fulfilled and controlled by big tech companies, as long as you go along with the party line and don’t get kicked off.

  18. [QUOTE=”noko, post: 47328, member: 69″]
    I hope Intel with their actual manufacturing capability can come through.
    [/QUOTE]
    They’re stealing it from AMD at TSMC :ROFLMAO:

    Still, Intel is probably the only party that can do anything about prices – assuming they bring real availability, they’re probably going to want to buy market share by sacrificing margins for their first few ‘enthusiast’ generation releases, and that’s going to put downward pressure on Nvidia and AMD.

    Supposing Intel does that, I don’t expect them to continue forever as they do very much like their margins, but they may very well help reset prices somewhat.

  19. [QUOTE=”LazyGamer, post: 47333, member: 1367″]
    They’re stealing it from AMD at TSMC :ROFLMAO:

    Still, Intel is probably the only party that can do anything about prices – assuming they bring real availability, they’re probably going to want to buy market share by sacrificing margins for their first few ‘enthusiast’ generation releases, and that’s going to put downward pressure on Nvidia and AMD.

    Supposing Intel does that, I don’t expect them to continue forever as they do very much like their margins, but they may very well help reset prices somewhat.
    [/QUOTE]
    lol, yep, another TSMC product waiting in line. I would expect Intel to concentrate on OEM builds for ARC first while retail gets the scraps. They have a number of upcoming versions; whether any will be on an Intel process besides the iGPU versions, I do not know. I would hope Intel can take control of their manufacturing process for these; maybe TSMC is just that much better process-wise to give Intel a chance to compete in GPUs.

  20. [QUOTE=”noko, post: 47334, member: 69″]
    lol, yep, another TSMC product waiting in line. I would expect Intel to concentrate on OEM builds for ARC first while retail gets the scraps.
    [/QUOTE]
    I read this sentence with a touch of sadness – I don’t want it to be true, but I have to admit that it comes across as entirely plausible.
    [QUOTE=”noko, post: 47334, member: 69″]
    They have a number of upcoming versions; whether any will be on an Intel process besides the iGPU versions, I do not know. I would hope Intel can take control of their manufacturing process for these; maybe TSMC is just that much better process-wise to give Intel a chance to compete in GPUs.
    [/QUOTE]
    It’s not that Intel can’t make the discrete GPUs they’re farming out to TSMC, but rather that, while Intel is upgrading their own fabs to catch back up in CPU production, TSMC has the better fabrication technology for Intel’s GPUs. It probably helps that TSMC has had decades of experience fabbing ATi / Nvidia / AMD GPUs as well.

    Whether Intel will dedicate fab capacity in the future toward producing their own discrete GPUs is anyone’s guess. I’d think that they’d want to, but they may not know themselves whether that will be more cost efficient than using TSMC or other third-party fabs.

  21. [QUOTE=”LazyGamer, post: 47333, member: 1367″]
    They’re stealing it from AMD at TSMC :ROFLMAO:

    Still, Intel is probably the only party that can do anything about prices – assuming they bring real availability, they’re probably going to want to buy market share by sacrificing margins for their first few ‘enthusiast’ generation releases, and that’s going to put downward pressure on Nvidia and AMD.

    Supposing Intel does that, I don’t expect them to continue forever as they do very much like their margins, but they may very well help reset prices somewhat.
    [/QUOTE]
    I don’t think Intel will be our white knight here – so far they have signaled their GPUs will be open for business for mining. Which means just more blood for the blood god.

  22. [QUOTE=”Brian_B, post: 47340, member: 96″]
    I don’t think Intel will be our white knight here – so far they have signaled their GPUs will be open for business for mining. Which means just more blood for the blood god.
    [/QUOTE]
    Maybe Intel can out-blood AMD and Nvidia. AMD with TSMC and Nvidia with Samsung are, at this point, not even coming close to satisfying demand, and are also pissing many folks off. Still doubtful TSMC will do much better with Intel.

  23. [QUOTE=”noko, post: 47347, member: 69″]
    Maybe Intel can out-blood AMD and Nvidia. AMD with TSMC and Nvidia with Samsung are, at this point, not even coming close to satisfying demand, and are also pissing many folks off. Still doubtful TSMC will do much better with Intel.
    [/QUOTE]
    Nvidia’s back to TSMC for the RTX 4000-series too.

    I’m wondering when Samsung will pick up another run. They are fabbing AMD GPUs into their new Exynos SoCs, so maybe AMD is willing to take a chance on them for some parts?

  24. Intel is very serious, though in this case 2025 does not help much right now.

    [URL unfurl=”true”]https://time.com/6140476/intel-building-factory-ohio/[/URL]

  25. [QUOTE=”LazyGamer, post: 47348, member: 1367″]
    Nvidia’s back to TSMC for the RTX 4000-series too.

    I’m wondering when Samsung will pick up another run. They are fabbing AMD GPUs into their new Exynos SoCs, so maybe AMD is willing to take a chance on them for some parts?
    [/QUOTE]
    AMD needs to do something to grow; increasing prices without dramatically increasing the number of chips (hitting a wall, so to speak) will not go well over time. So yes, Samsung could, and I would think should, be used on some SKUs to increase supply.

  26. Well, I checked the local MC… these nasty-assed 6500s are in stock for $199. Ugh…
    [URL unfurl=”true”]https://www.microcenter.com/product/646514/asrock-amd-radeon-rx-6500-xt-phantom-gaming-d-overclocked-dual-fan-4gb-gddr6-pcie-40-graphics-card?sku=370973&utm_source=20220121_A_eNews_Computer_Parts_R6448&utm_medium=email&utm_campaign=R6448&MccGuid=C1BA145D-CE5D-49F6-950F-AF13D4AA7FFF[/URL]

  27. [QUOTE=”David_Schroth, post: 47323, member: 1″]
    So… if you consider the current street prices (and forget history) for all currently produced GPUs, the 6500 XT family is priced for its performance right where it should be, assuming that street price ends up in the $200-300 range.

    Going by a browsing of sold eBay items, RTX 3060s are selling for $700-800, RX 6600 XTs are selling for about $600, and 6600 non-XTs are selling for $600. Of course, the RTX 3050 is TBD with respect to street pricing. Therefore, in theory, if the street price even settles at $300-350, you’re still paying HALF what you would for the next step up, which gives you a choice – am I OK with low-medium 1080p settings in current AAA titles and high settings for older/less graphics-intensive games? Or should I spend twice as much for a better 1080p experience?

    If you go by historic prices and market placement, this is a low $100 MSRP card (and I could go on and on about where all of these cards _should_ be priced in the current reality). Unfortunately, this is the reality we’ve been dealing with for a couple of years and I don’t see that changing anytime soon unless maybe the Eth price keeps cratering and stays cratered like it has been this week.
    [/QUOTE]
    Prices aside for the moment, just based on the hardware specifications, the engineering, and the market this GPU is aimed at, it’s wholly under-engineered, with too many cuts even for a dedicated desktop PC gaming video card.

    This is backed up by the fact that we now know it was intended as a laptop GPU, to go alongside an APU that would already have VCE capabilities and thus only require limited output connections and limited PCIe bandwidth; it was meant to sit next to Rembrandt 6000-series APUs.

    They basically pushed a laptop GPU to the extreme to make it a dedicated desktop GPU and wrapped it up in that package, so in that regard it is completely behind the curve for such a role. It is a step backward, with older GPUs offering features it lacks.

    This further demonstrates to me the decline of low-end GPUs on the desktop and backs up the idea that this market is fading for new GPU IPs.
