Image: AMD

TechSpot has shared new benchmarks that suggest the AMD Radeon RX 6500 XT could be a disappointment for gamers who will be running the graphics card in PCIe 3.0 mode.

While third-party benchmarks for the Radeon RX 6500 XT will not be shared until later this week, TechSpot has teased the potential performance of AMD’s new $200 budget option by benchmarking its predecessor, the Radeon RX 5500 XT, under various PCIe bandwidth configurations. Assuming that the cards are truly as similar as TechSpot believes, Radeon RX 6500 XT users may see performance decreases of as much as 43% (in 1% minimum FPS) in select games such as Shadow of the Tomb Raider when the card is run in PCIe 3.0 mode (x4, 4 GB/s vs. 8 GB/s).
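
For context, the rough arithmetic behind those bandwidth figures: PCIe 3.0 carries about 0.985 GB/s per lane (8 GT/s with 128b/130b encoding) and PCIe 4.0 doubles that, so a x4 link comes to roughly 4 GB/s in a PCIe 3.0 slot and roughly 8 GB/s in a PCIe 4.0 slot. Below is a minimal sketch of that math; the per-lane rates are the standard PCIe figures, not numbers from TechSpot’s testing.

```python
# Approximate usable PCIe bandwidth per lane in GB/s:
# raw rate in GT/s, scaled by 128b/130b encoding efficiency, divided by 8 bits/byte.
PER_LANE_GB_S = {
    "PCIe 3.0": 8 * (128 / 130) / 8,    # ~0.985 GB/s per lane
    "PCIe 4.0": 16 * (128 / 130) / 8,   # ~1.969 GB/s per lane
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Per-direction link bandwidth in GB/s, ignoring protocol overhead."""
    return PER_LANE_GB_S[gen] * lanes

if __name__ == "__main__":
    for gen, lanes in [("PCIe 3.0", 4), ("PCIe 4.0", 4), ("PCIe 3.0", 8)]:
        print(f"{gen} x{lanes}: {link_bandwidth(gen, lanes):.1f} GB/s")
    # PCIe 3.0 x4: ~3.9 GB/s  (the "4 GB/s" case above)
    # PCIe 4.0 x4: ~7.9 GB/s  (the "8 GB/s" case above)
    # PCIe 3.0 x8: ~7.9 GB/s  (same bandwidth as PCIe 4.0 x4)
```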

The benchmarks suggest that AMD could have partially avoided this problem by giving the Radeon RX 6500 XT the same 8 GB of memory available on the previous model. Performance degradation was considerably smaller in the results from the 8 GB Radeon RX 5500 XT than from the 4 GB version.

Image: TechSpot

“[…] limiting the 4GB model to even PCIe 4.0 x4 heavily reduced performance, suggesting that out of the box the 6500 XT could be in many instances limited primarily by the PCIe connection, which is pretty shocking,” concluded TechSpot. “It also strongly suggests that installing the 6500 XT into a system that only supports PCI Express 3.0 could in many instances be devastating to performance.”

“At this point we feel all reviewers should be mindful of this and make sure to test the 6500 XT in PCIe 3.0 mode. There’s no excuse not to do this as you can simply toggle between 3.0 and 4.0 in the BIOS. Of course, AMD is hoping reviewers overlook this and with most now testing on PCIe 4.0 systems, the 6500 XT might end up looking a lot better than it’s really going to be for users.”

Announced earlier this month, the Radeon RX 6500 XT is a graphics card with 1024 cores, 4 GB of memory, a boost clock of up to 2815 MHz, and a TBP of 107 watts. According to AMD, the GPU delivers up to 35 percent faster gaming performance on average at 1080p with high settings compared to the competition’s offerings.

Source: TechSpot


Join the Conversation

13 Comments

  1. This sounds like a driver issue more so than a real bandwidth issue. Seems like there was something similar earlier with an AMD card, and it ended up getting fixed.

    1. This is an ‘old’ card that was benchmarked.

      So unless AMD fixed it and then broke it again after the 5500XT was released, I can’t see this as being a driver issue.

      The results of the 5500XT are similar to launch.

  2. So what’s the theory here, that it needs to load textures from RAM mid-game, and this is what is resulting in the 1% FPS drops?

    Sounds possible I guess, but it looks like he is only running it in x8 mode.

    Why would you ever do that? If – as he concludes – x8 gen 4 is fine, 16x Gen 3 should be just as fine.

    No one should be running a GPU under 16x.

    That, and I suspect this might be a corner case. For the overwhelming majority of titles out there 4GB should be more than enough for 1080p for now, and by the time it isn’t, this GPU probably won’t be fast enough to run those titles anyway.

    It almost looks like he is intentionally sabotaging the card by running it in x8 mode in order to try to make a point.

  3. [QUOTE=”Zarathustra, post: 47027, member: 203″]
    So what’s the theory here, that it needs to load textures from RAM mid-game, and this is what is resulting in the 1% FPS drops?

    Sounds possible I guess, but it looks like he is only running it in x8 mode.

    Why would you ever do that? If – as he concludes – x8 gen 4 is fine, 16x Gen 3 should be just as fine.

    No one should be running a GPU under 16x.

    That, and I suspect this might be a corner case. For the overwhelming majority of titles out there 4GB should be more than enough for 1080p for now, and by the time it isn’t, this GPU probably won’t be fast enough to run those titles anyway.

    It almost looks like he is intentionally sabotaging the card by running it in x8 mode in order to try to make a point.
    [/QUOTE]
    The rumor is that the 6500 XT is PCIe 4.0 x4. So at maximum it has the same bandwidth as PCIe 3.0 x8. Worst-case scenario, someone puts it into a PCIe 3.0 slot.

  4. [QUOTE=”serpretetsky, post: 47028, member: 4634″]
    The rumor is that the 6500 XT is PCIe 4.0 x4. So at maximum it has the same bandwidth as PCIe 3.0 x8. Worst-case scenario, someone puts it into a PCIe 3.0 slot.
    [/QUOTE]
    TechPowerUp is saying PCIe 4.0 x8 – that may not be accurate, though

    [URL unfurl=”true”]https://www.techpowerup.com/gpu-specs/radeon-rx-6500-xt.c3850[/URL]

  5. My video card might get more FPS on a wider bus running at a faster clock rate. And if I run it on a narrower bus at a lower clock rate, the performance will suffer.

    I mean, the guy isn’t wrong, as long as the larger, faster bus is supported by the card and system. By and large these cards may be refreshing systems with only PCIe 3.x slots, so this shit won’t even be theoretical.

    Man, my card would be faster if it was on a better bus. Yeah, and my engine would be faster if my intake, fuel delivery, and exhaust were better.

    I just don’t get it. Maybe do some real-world comparisons on motherboards with like CPUs in PCIe 4.0 and PCIe 3.0 modes to see if this makes a difference on lower-tier cards.

  6. [MEDIA=youtube]ZFpuJqx9Qmw[/MEDIA]

    According to this, it only uses 4 lanes. When you stick this card on PCIe 3.0, it gets crippled. In many cases the RX 580 is coming out ahead. I think the RX 580 also has more VRAM. There should never be situations where a low-end card from today is beaten by a low-end card from 5 years ago (and let’s not forget that the RX 580 is a refresh of the RX 480). Good gawd. Well, at least it beats the GTX 970 (barely), a mid-range card from almost 8 years ago. Not in GTA V though.

    1440p results just make those older cards like the RX 580 and GTX 970 look even better than the 6500 XT, sheesh. Although I don’t know why you would have cards like these and try to run games at 1440p, but interesting to see for testing purposes.

    Well, at least the 6500 XT used the least electricity.

    Who decided to let this card come to market like this? What a waste of a product. I don’t understand how this card was released for sale. This doesn’t seem like a real product. More like an April Fool’s joke.

  7. Title is written backwards.
    It’s not a disaster for PCI 3 systems – they continue to work fine as they have for years.
    It’s that the 6500 is a disaster of a card.

  8. [QUOTE=”DrezKill, post: 47305, member: 230″]
    1440p results just make those older cards like the RX 580 and GTX 970 look even better than the 6500 XT, sheesh. Although I don’t know why you would have cards like these and try to run games at 1440p, but interesting to see for testing purposes.
    [/QUOTE]
    I upgraded to a GTX970 while playing at 1600p… but that was back when they were new! Even got a second for SLi, back when that was a thing 🙂

    I did play through The Outer Worlds at 1440p on one of those GTX970s, and the experience was barely passable, regularly dropping to just below 30FPS. I think that I made it work due to my own stubbornness more than anything else.

    [QUOTE=”DrezKill, post: 47305, member: 230″]
    Who decided to let this card come to market like this? What a waste of a product. I don’t understand how this card was released for sale. This doesn’t seem like a real product. More like an April Fool’s joke.
    [/QUOTE]
    So, it’s basically a mobile silicon spin that’s been dropped into a desktop form-factor. It’s entirely stripped down for that purpose, where PCIe 4.0 x4 is apparently the norm.

  9. [QUOTE=”LazyGamer, post: 47341, member: 1367″]
    So, it’s basically a mobile silicon spin that’s been dropped into a desktop form-factor. It’s entirely stripped down for that purpose, where PCIe 4.0 x4 is apparently the norm.
    [/QUOTE]

    Interesting.

    In my old Latitude E6540 with an i7-4810MQ and a Radeon HD 8790M, the Radeon got 8 lanes, but I guess on newer machines where Gen 4 PCIe is available, they have justified dropping that down to x4.

    In a laptop this isn’t going to be a problem. The GPU will be soldered to the motherboard, so you won’t have people sticking it in a gen 3 board, but in the desktop form factor, this is a risk.

  10. What does a 580 cost these days and where could you buy it? This would be a terrible launch if you could just buy any model of card you wanted. As far as I can tell, the real competition is:

    1650: $319
    1050ti: $299
    6500XT: $259
    RX 550: $229

    Given the real-world availability, the 6500XT isn’t… terrible?

    In all honesty, the name is the real issue. If they called it the 6400XT, for example, people probably wouldn’t be complaining as much.

  11. [QUOTE=”Brian_B, post: 47320, member: 96″]
    Title is written backwards.
    It’s not a disaster for PCI 3 systems – they continue to work fine as they have for years.
    It’s that the 6500 is a disaster of a card.
    [/QUOTE]

    I think it is a fine card, provided you use it in a gen 4 system.

    They named it incorrectly though. It should have been called the 6400 or something lower.

    Price/performance in this market is actually fairly decent for a budget card for 1080p use.

  12. [QUOTE=”LazyGamer, post: 47341, member: 1367″]
    I upgraded to a GTX970 while playing at 1600p… but that was back when they were new! Even got a second for SLi, back when that was a thing 🙂

    I did play through The Outer Worlds at 1440p on one of those GTX970s, and the experience was barely passable, regularly dropping to just below 30FPS. I think that I made it work due to my own stubbornness more than anything else.
    [/QUOTE]
    I got a GTX 970 at launch, and I used it for 1200p gaming. I didn’t upgrade to 1440p until after I got a 1080 Ti 5 years later (and that was only cuz my 1200p monitor died after 9 years).

    [QUOTE=”LazyGamer, post: 47341, member: 1367″]
    So, it’s basically a mobile silicon spin that’s been dropped into a desktop form-factor. It’s entirely stripped down for that purpose, where PCIe 4.0 x4 is apparently the norm.
    [/QUOTE]
    Oh well dang.

    [QUOTE=”Zarathustra, post: 47343, member: 203″]
    In a laptop this isn’t going to be a problem. The GPU will be soldered to the motherboard, so you won’t have people sticking it in a gen 3 board, but in the desktop form factor, this is a risk.
    [/QUOTE]
    Indeed.

    [QUOTE=”Endgame, post: 47344, member: 1041″]
    If they called it the 6400XT, for example, people probably wouldn’t be complaining as much.
    [/QUOTE]
    [QUOTE=”Zarathustra, post: 47345, member: 203″]
    They named it incorrectly though. It should have been called the 6400 or something lower.
    [/QUOTE]
    Yeah I agree.
