Conclusion

Today, AMD launched its Radeon RX 6500 XT graphics card.  The Radeon RX 6500 XT is a card gamers especially have been waiting on for a long while, hoping it would arrive sooner.  With rumors of the entry-level/mainstream GPU market dying off, it was unclear whether new architectures would ever trickle down to the everyday gamer.  Amid video card shortages and outrageous prices, we’ve needed a more affordable video card, one we can all actually get our hands on.  The Radeon RX 6500 XT is AMD’s answer to those cries.  But is it enough?

At its heart, the AMD Radeon RX 6500 XT is based on a brand-new ASIC that AMD says was built from the ground up for this design.  It is fabricated on TSMC’s N6 (6nm) node, a first for a consumer GPU, which enables incredibly high clock speeds and gives us a glimpse of what to expect from GPUs moving forward.  Could we see them push past the 3GHz barrier?  There is very much that potential as new nodes like this one mature.

Built on the RDNA2 architecture and TSMC’s N6 node, the Radeon RX 6500 XT has the markings of a highly engineered, bleeding-edge GPU.  But there are a few design choices that could ultimately hold it back.  First, let’s explore the real-world gaming performance we experienced.

Performance

In our testing today, we worked through the scale of image quality settings, from highest to lowest, at 1080p on the GIGABYTE Radeon RX 6500 XT EAGLE 4G video card.  We wanted to see which settings bottleneck the video card in games and which settings allow the best gameplay performance.  If you are aiming for 60fps, for example, the graphs will show you which game settings make that possible.

We started off in Forza Horizon 5, without Ray Tracing, and quickly found Extreme settings to be a bottleneck.  Moving down to Ultra helped but was ultimately still too slow.  We got a big boost at High settings, however, and it was there that the video card performed well.  Compared to the Radeon RX 5500 XT, though, it was only a few percent faster.  Compared to the GeForce GTX 1650, we easily saw a 34% improvement in performance, backing up AMD’s claims.

In Far Cry 6, we found that using HD Textures was impossible; the 4GB framebuffer cannot handle them.  Even with FSR, the game was choppy with HD Textures.  In fact, even without HD Textures, the VRAM indicator in Far Cry 6 showed us over the limit all the way down until we hit Low settings.  The game was choppy at Ultra, better at High, and very smooth at Medium; once again, Medium seemed to be the best setting overall.  Performance compared to the RX 5500 XT, though, was merely on par, no better really.

Cyberpunk 2077, if it is any indication of future gameplay performance, showed the GIGABYTE Radeon RX 6500 XT EAGLE 4G performing worse than the SAPPHIRE PULSE Radeon RX 5500 XT.  At every quality setting, the newer Radeon RX 6500 XT was slower.  It could be due to PCIe bandwidth, it could be other reasons, but whatever the cause, it’s real.  The only remotely playable setting was Low, and even then the game was not that enjoyable performance-wise.

Godfall also choked at Ultra settings, and even at High it was choppy; it wasn’t until we went down to Medium that the game was smooth.  You could get away with FSR at High, though, if you really wanted to.  In terms of performance, once again the Radeon RX 6500 XT was not much faster than the RX 5500 XT; only at Medium did it pull ahead.

Horizon Zero Dawn performed rather well, though not at Ultimate settings, where it was choppy.  Below that, the game performed much better.  If you want the smoothest performance, use the Original setting.  With FSR enabled, you should also be able to play well at Favor Quality.  In terms of performance, once again we aren’t much faster than the Radeon RX 5500 XT.

Among all the games, Microsoft Flight Simulator 2020 Game of the Year Edition in DX12 mode showed the biggest benefit on the Radeon RX 6500 XT.  We think this is mostly down to the newer RDNA2 architecture; the older generation of video cards struggled in this game, while RDNA2 cards do a lot better.  In that regard, the GIGABYTE Radeon RX 6500 XT EAGLE 4G is much better for this game than the Radeon RX 5500 XT.  However, you will still need to lower the game to High-End or Medium to get the best performance; Ultra is far too choppy.

The GIGABYTE Radeon RX 6500 XT EAGLE 4G

Let’s talk a little bit about the GIGABYTE Radeon RX 6500 XT EAGLE 4G video card itself.  GIGABYTE has put together a well-made video card: it’s solid in the hand, light in weight, and very compact, so it should fit in small cases.  It’s also very quiet, never making a peep, it stayed very cool while we were gaming, and it doesn’t consume a lot of power.

We were excited to see this video card sustain 2900MHz while gaming.  The frequency was consistent and stayed that high throughout gameplay.  This is well above AMD’s quoted specs for the GPU and gives us hope of very high overclocks, which we intend to explore in the future, so stay tuned.  We are also interested to see whether there is any headroom in a memory overclock, since the card does seem to be memory bandwidth constrained.

Overall, we think GIGABYTE did a great job engineering this video card, and we look forward to taking a look at the GAMING OC version to see how it differs.


Dirty Details

While there is a lot of good going for the Radeon RX 6500 XT, there are also some very glaring and questionable choices made in the name of reducing cost.  Reducing cost is important, don’t get us wrong; in today’s market, anything AMD and NVIDIA can do to reduce costs for AIBs matters.  However, sometimes you have to ask whether the cost savings are worth the detriment to performance, to features, and to the end-user who is already spending an inflated amount of money on the video card.

In this industry, you get points for adding or maintaining features from generation to generation, not removing features from generation to generation.

PCI-Express

Cutting the PCI-Express connection down to a PCIe 4.0 x4 link (four lanes) is one of those questionable choices.  While this card is in a PCI-Express 4.0 (or PCIe 5.0) motherboard, like AMD X570 or Intel Z590 or Z690, it will perform well enough with a 4GB framebuffer.  The problem is that moving down to a PCI-Express 3.0 platform, which is where most people looking to upgrade still are, will bottleneck the video card pretty severely.

Consider that the very GeForce GTX 1650 cards AMD wants to replace are primarily sitting in PCI-Express 3.0 systems; putting this card into those systems as an upgrade will create a major bottleneck.  In addition, not every new CPU or APU supports PCI-Express 4.0.  Those shiny new AMD Ryzen 5600G and 5700G APUs that were just released?  Yep, those are limited to PCI-Express 3.0, no matter which chipset motherboard you have.  AMD X470, X370, B450, or A-series motherboards?  Yep, limited to PCI-Express 3.0.  It is also only recently that Intel adopted PCI-Express 4.0, with its Rocket Lake launch in early 2021; everything before it is PCI-Express 3.0.
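
To put rough numbers on that gap, here is a back-of-the-envelope sketch (our own arithmetic, not an AMD figure) of what an x4 link actually delivers across PCIe generations:

```python
# Back-of-the-envelope PCIe bandwidth math (our own numbers): per-lane
# raw rates for PCIe 3.0/4.0 and the 128b/130b line coding both use.
GT_PER_LANE = {"PCIe 3.0": 8.0, "PCIe 4.0": 16.0}  # gigatransfers/s
ENCODING = 128 / 130                                # 128b/130b overhead

def effective_gbs(gen: str, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s."""
    return GT_PER_LANE[gen] * ENCODING * lanes / 8  # bits -> bytes

for gen in GT_PER_LANE:
    for lanes in (4, 16):
        print(f"{gen} x{lanes}: ~{effective_gbs(gen, lanes):.1f} GB/s")

# PCIe 4.0 x4:  ~7.9 GB/s  <- RX 6500 XT on a current platform
# PCIe 3.0 x4:  ~3.9 GB/s  <- RX 6500 XT on the systems it should upgrade
# PCIe 3.0 x16: ~15.8 GB/s <- what a GTX 1650 gets in those same systems
```

In other words, drop this card into the PCI-Express 3.0 system it is most likely to land in, and it has roughly a quarter of the bus bandwidth of the x16 card it is meant to replace, a gap that matters constantly when the 4GB framebuffer forces traffic over the bus.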

In 2022 we shouldn’t be facing PCI-Express bandwidth constraints on a video card; we just shouldn’t.  Motherboards now offer incredible PCI-Express bandwidth potential with PCIe 4.0 and now 5.0.  There is simply no excuse for a video card to be bottlenecked by the PCI-Express bus; there are plenty of other places a bottleneck could reasonably live, and in this day and age the PCIe bus should not be one of them.

Media Encoding

The other major cut is the removal of the media encoding engine, VCE, also called the Video Coding Engine.  While the card can decode VP9, H.264, and H.265, it cannot decode AV1, and it cannot encode anything in hardware; there is no GPU-accelerated encoding of any kind.  While this video card is certainly positioned as a “gaming video card,” it is often these lower-end, lower-priced video cards that content creators flock to.

For a very long time, the norm has been for content creators to save money: not needing a gaming video card or its price tag, they buy the lower-end model and use its media encoding capabilities.  These cards have often been great, reasonably priced upgrades into hardware-accelerated GPU media encoding.

In addition, streamers today are large in number.  More and more people stream, either while playing games or for non-gaming purposes entirely.  Streaming programs like OBS can offload encoding to the GPU to free up resources; the dedicated GPU media encoder takes the work off your CPU and is better suited to the task, allowing higher bitrates while gaming.  However, AMD has taken this ability away from the Radeon RX 6500 XT.  You will have to rely completely on your CPU for streaming and any content creation.  Keep in mind this functionality existed on the AMD Radeon RX 5500 XT.
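
To make that concrete, here is a minimal sketch (ours, not an AMD or OBS tool; it assumes an ffmpeg build compiled with AMD AMF support) of the kind of hardware-encoder probe that comes up empty-handed on this card:

```python
# A minimal sketch (ours, assuming an ffmpeg build with AMD AMF support):
# list the AMD hardware encoder names this ffmpeg build exposes.
import subprocess

def amf_encoders() -> list[str]:
    """Return the AMF (AMD hardware) encoders ffmpeg knows about."""
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.split()[1] for line in out.splitlines() if "_amf" in line]

print(amf_encoders())  # e.g. ['h264_amf', 'hevc_amf']
# The build may list these, but with no encode engine on the RX 6500 XT,
# initializing one of them fails, leaving software x264 on the CPU as the
# only way to stream or record.
```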

Framebuffer

Another potential issue is the framebuffer itself.  The AMD Radeon RX 5500 XT basically came in two models, 4GB and 8GB, and many AIBs sold both.  That is not so with the Radeon RX 6500 XT: it is 4GB only, with no 8GB option.

Display Ports

Finally, the last dirty little detail is that the physical display outputs have been reduced.  On the Radeon RX 6500 XT, you will find only one DisplayPort and one HDMI port; indeed, our GIGABYTE RX 6500 XT EAGLE had exactly one of each.  That is half of what the Radeon RX 5500 XT offered: our SAPPHIRE PULSE RX 5500 XT has three DisplayPorts and one HDMI, a total of four ports versus two on the 6500 XT.

Summary

The AMD Radeon RX 6500 XT is an interesting video card.  Overall, performance in our testing wasn’t that bad at the right settings.  Some games were slower than the Radeon RX 5500 XT, some were the same, and some were faster.  It was definitely faster than the GeForce GTX 1650 and would constitute an upgrade; we validated and can back up AMD’s claims in regard to its performance versus the GTX 1650.

Compared against the Radeon RX 5500 XT, it jumped back and forth depending on the game.  It wasn’t a clear win, but neither was it a clear loss, except in Cyberpunk 2077.  Some of that may come down to PCI-Express bandwidth, or to a combination of that, the 64-bit memory bus, and how that bus interacts with the Infinity Cache in some games on the RX 6500 XT.

Once the Infinity Cache fills up, you are down to swapping out over a 144GB/s memory bus, and then over a slower PCIe connection, all with a small 4GB framebuffer.  This causes inconsistency between games: some games take a real performance hit, while others don’t mind at all and still perform well.  It depends on the game.  Unfortunately, we don’t know what the future has in store, but we can assume games will keep utilizing and needing more VRAM, even at 1080p, and that constrained PCIe bus is going to play a bigger part as we move forward.
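
Here is that hierarchy in rough numbers, using our own arithmetic from the published specs (18Gbps GDDR6 on a 64-bit bus, spilling over a PCIe x4 link):

```python
# Our own arithmetic from the published specs: the bandwidth tiers data
# falls through once the 16MB Infinity Cache misses.
GDDR6_GBPS_PER_PIN = 18   # effective GDDR6 data rate per pin
BUS_WIDTH_BITS = 64       # RX 6500 XT memory bus width

vram_bw = GDDR6_GBPS_PER_PIN * BUS_WIDTH_BITS / 8  # GB/s
pcie4_x4 = 16 * (128 / 130) * 4 / 8                # GB/s, one direction
pcie3_x4 = pcie4_x4 / 2

print(f"GDDR6 on a 64-bit bus: {vram_bw:.0f} GB/s")    # 144 GB/s
print(f"PCIe 4.0 x4 spillover: ~{pcie4_x4:.1f} GB/s")  # ~7.9 GB/s
print(f"PCIe 3.0 x4 spillover: ~{pcie3_x4:.1f} GB/s")  # ~3.9 GB/s
# Anything that overflows the 4GB framebuffer drops from 144 GB/s to
# single-digit GB/s over the bus, which is why results swing so much
# from game to game.
```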

Overall, we found this video card best suited for Medium gameplay settings at 1080p in current games.  If a game supports FSR, you might be able to play at High.  Ultra and Ultimate settings are definitely out of the question.  In the more intensive games, you are likely to find yourself in the Medium-to-Low category; in Cyberpunk 2077, even Low wasn’t low enough.  Future games will only get more demanding.
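
As a point of reference for those FSR recommendations, the arithmetic below (ours, using AMD’s published FSR 1.0 per-axis scale factors) shows why upscaling can buy a settings tier back at 1080p:

```python
# Our own arithmetic from AMD's published FSR 1.0 per-axis scale factors:
# the internal render resolution behind a 1920x1080 output.
FSR_SCALE = {"Ultra Quality": 1.3, "Quality": 1.5,
             "Balanced": 1.7, "Performance": 2.0}

target_w, target_h = 1920, 1080
for mode, s in FSR_SCALE.items():
    w, h = round(target_w / s), round(target_h / s)
    print(f"{mode:>13}: {w}x{h} (~{100 / (s * s):.0f}% of the pixels)")

# Quality mode, for example, renders 1280x720 -- about 44% of the pixel
# work -- which is how a card limited to Medium at native 1080p can
# sometimes stretch to High.
```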

We didn’t do any in-depth Ray Tracing testing; frankly, that feature is just there to check a box.  The card is not practically playable with Ray Tracing.  You can turn it on, see what it looks like, and then you’ll end up turning it off so you can get higher performance and a smoother experience.

Final Points

The AMD Radeon RX 6500 XT has limitations and bottlenecks that may put a damper on it in the long run.  The PCI-Express bus, the media encoder, the framebuffer options, and the display outputs are all areas where we think better compromises could have been struck between cost savings, features, and performance.  Giving us an 8GB framebuffer, or even the option of one, might mitigate the PCIe issues most of us will have.  And did the media encoder really need to be pulled?

Those two things alone would have eased the tensions people are feeling right now.  But we can’t go back to what isn’t; we have to look at what is in front of us right now.  This is a card for those who really need a new card, have no other options, and just want something that’s available at a decent price in today’s market.  Just know it’s a very focused card: gaming-oriented, but “Medium-to-Low” 1080p gaming.  It’ll get you out of a bind, but its longevity is questionable.

The GIGABYTE Radeon RX 6500 XT EAGLE 4G is a fine, basic, no-frills video card; if you are looking for a Radeon RX 6500 XT, it should sit at the lower end of pricing and hopefully be more affordable and available for you.  We won’t even get into pricing; it is what it is, and the only saving grace of the Radeon RX 6500 XT will be if it is actually available, in stock, and as close to MSRP as possible.

Discussion


Brent Justice


Join the Conversation


  1. Nice that you got a review sample – that bodes well for availability.

    It may have drawbacks, but at this point I’d say anything that can stay on the shelves for near MSRP is going to be a huge improvement.

  2. More like 150% of MSRP for the out of stock listings I’ve seen – but that’s still an improvement! Somehow!

    I also appreciate the testing of the transcoding… err, ‘limited decoding’ block. I generally disapprove of Intel’s IGP-less ‘F’ SKUs because they cut off Quicksync along with the IGP, as well as AMD’s release of mostly GPU-less desktop parts.

    I’m not about to spit on working, [I]available[/I] GPUs, but I think AMD cut just a few too many corners here – and they still didn’t get it down to the sub-75W level of power draw needed for powering cards directly through the PCIe slot.

  3. Wow, that was a little worse than I expected, especially considering it isn’t selling for $199 hardly anywhere.

  4. [QUOTE=”LazyGamer, post: 47171, member: 1367″]
    I’m not about to spit on working, [I]available[/I] GPUs, but I think AMD cut just a few too many corners here – and they still didn’t get it down to the sub-75W level of power draw needed for powering cards directly through the PCIe slot.
    [/QUOTE]

    [QUOTE=”Dogsofjune, post: 47182, member: 168″]
    Wow, that was a little worse than I expected, especially considering it isn’t selling for $199 hardly anywhere.
    [/QUOTE]

    Yeah, I think in “Days of Yore” this would have been the bottom tier GPU and gone in the $100-$120 budget range. But today – yeah, anything is better than nothing and the only real competition is whatever you can find scalped on Ebay.

  5. It is nice to see that you tested using FSR where the option was available. This is very useful, keeping in mind it is similar to the soon-to-be-released RSR.

    Do you plan on a separate testing for pci3 vs pci4 (idea is to see how the playable settings get reduced in pci3 vs pci4) ?

  6. [QUOTE=”Marees, post: 47229, member: 1536″]
    Do you plan on a separate testing for pci3 vs pci4 (idea is to see how the playable settings get reduced in pci3 vs pci4) ?
    [/QUOTE]
    There were the 3DMark comparisons on page 4; they go through a couple of different PCIe scenarios, but only look at changes in 3DMark. So there is some data.

    I’ll give you – that isn’t the same thing as gameplay. Might be neat to see how a couple of titles fare, to see if there are major changes outside of a canned benchmark designed to exacerbate differences; I don’t know if it warrants going through the entire suite of tests again, though.

  7. [QUOTE=”Brian_B, post: 47231, member: 96″]
    There were the 3DMark comparisons on page 4; they go through a couple of different PCIe scenarios, but only look at changes in 3DMark. So there is some data.

    I’ll give you – that isn’t the same thing as gameplay. Might be neat to see how a couple of titles fare, to see if there are major changes outside of a canned benchmark designed to exacerbate differences; I don’t know if it warrants going through the entire suite of tests again, though.
    [/QUOTE]
    Agreed… I think we can draw a relative conclusion from Brent’s data as to how gaming will be. Don’t expect to have the settings tuned up too high.

  8. Gamers Nexus was also similarly displeased. In a normal world, this would be a $100-140 card. 1080p gaming for the masses. This thing selling at retail for $250-300 should be a war crime.

  9. Very disappointing overall on the whole combination. At $200 MSRP, if it’s even readily available for that price, with 1080p Medium-to-Low settings needed to make current-generation games usable, this does not look like a card that will stand the test of time. For me, this is the worst GPU of this generation, and it makes even Nvidia look good, which I thought would be impossible for AMD to do (I was wrong). I do like the review overall; well done exposing the usability of this card with lower settings, FSR options, and ReBAR, which is readily available on AMD and Intel systems. On eBay one can buy a used AMD Fury for less than $150 and get better performance, albeit at higher power and heat, unless you’re lucky and nab a Nano for less than $150, which are about 175W cards.

  10. It’s pretty awful as far as I am concerned. It’s a card that’s basically only about as fast as a $200 card from five years ago and gimped in a way that will make it age far worse. It’s already being scalped for $500+ on eBay. Meanwhile, on eBay there are cheaper alternatives on the used market that will provide a vastly superior gaming experience. Even at $200 I think it’s a non-starter given it’s essentially a waste of engineering time and effort that does nothing to advance the already stagnant segment.

    Frankly, I see absolutely no reason for this GPU to exist in its current form.

    That’s my $0.02 anyway.

  11. This I find amusing.

    [URL unfurl=”true”]https://wccftech.com/amd-removes-4gb-vram-is-not-enough-for-games-marketing-hours-before-radeon-rx-6500-xt-4-gb-launch/[/URL]

  12. [QUOTE=”Dogsofjune, post: 47272, member: 168″]
    This I find amusing.

    [URL unfurl=”true”]https://wccftech.com/amd-removes-4gb-vram-is-not-enough-for-games-marketing-hours-before-radeon-rx-6500-xt-4-gb-launch/[/URL]
    [/QUOTE]
    I think this is a no-win situation for AMD really.

    Had they put more than 4G of VRAM on here, it would just be mining fodder – having only 4G is the only chance it had of staying remotely available. It’s not like the “hash limiter” nVidia has out there does anything to deter miners or make the cards any more available.

    I rail against nVidia for their hash limiter being just a paper shield, and for putting out old video cards again. That doesn’t really fix anything. And I’ll say the same thing here – as Dan_D says, this isn’t any better than just re-releasing an RX 580, cutting the VRAM capacity, but leaving the price the same. But availability would be a step in the right direction.

    I don’t know what the right answer is to be truthful. But I’ve seen an awful lot of what I’m pretty sure are the wrong answers.

  13. Ethereum is memory bandwidth sensitive; the 6500 XT’s measly 64-bit memory bus probably makes it worthless for mining even with 8GB of onboard memory. The 6500 XT, I would say, has no reason to exist; it does everything poorly.

  14. [QUOTE=”Dan_D, post: 47270, member: 6″]
    Frankly, I see absolutely no reason for this GPU to exist in its current form.

    [/QUOTE]
    The reason is easy. $$$$. Nvidia did almost the same thing with the RTX 3050, it’s a gimp and not worth the cash either.

  15. So… if you consider the current street prices (and forget history) for all currently produced GPUs, the 6500 XT family is priced for its performance right where it should be, assuming that street price ends up in the $200-300 range.

    Going by a browsing of sold ebay items, RTX 3060’s are selling for $700-800, RX 6600 XTs are selling for about $600 and the 6600 non-XT are selling for $600. Of course, the RTX 3050 is TBD with respect to street pricing. Therefore, in theory, if the street price even settles at $300-350, you’re still paying HALF what you would for the next step up, which gives you a choice – Am I OK with low-medium 1080p settings in current AAA titles and high settings for older/low graphics intensive games? Or should I spend twice as much for a better 1080p experience?

    If you go by historic prices and market placement, this is a low $100 MSRP card (and I could go on and on about where all of these cards _should_ be priced in the current reality). Unfortunately, this is the reality we’ve been dealing with for a couple of years and I don’t see that changing anytime soon unless maybe the Eth price keeps cratering and stays cratered like it has been this week.

  16. [QUOTE=”David_Schroth, post: 47323, member: 1″]
    So… if you consider the current street prices
    [/QUOTE]
    This is the biggest party-pooping post I’ve seen in a long time. It’s also 100% true, which makes it all the more depressing.

    This post makes me want to go start drinking at 9am.

  17. And they are nowhere to be found, fail of all fails. The cheapest, most stripped-down card ever produced for this generation, and AMD and its AIBs cannot make them in sufficient numbers for the most desperate of desperate PC gamers, strapped for some kind of solution. AMD cannot manufacture, or manage to get manufactured, their own products -> pretty big red flag. I hope Intel with their actual manufacturing capability can come through. A few more years of this and I see the killing of PC gaming, PCVR, etc. A shift to services, as in computing/gaming just moving to cloud-based services, with specialty hardware like game consoles and more capable TVs making PCs much less relevant. Where your processing needs are fully fulfilled and controlled by big tech companies, as long as you go along with the party line and don’t get kicked off.

  18. [QUOTE=”noko, post: 47328, member: 69″]
    I hope Intel with their actual manufacturing capability can come through.
    [/QUOTE]
    They’re stealing it from AMD at TSMC :ROFLMAO:

    Still, Intel is probably the only party that can do anything about prices – assuming they bring real availability, they’re probably going to want to buy market share by sacrificing margins for their first few ‘enthusiast’ generation releases, and that’s going to put downward pressure on Nvidia and AMD.

    Supposing Intel does that, I don’t expect them to continue forever as they do very much like their margins, but they may very well help reset prices somewhat.

  19. [QUOTE=”LazyGamer, post: 47333, member: 1367″]
    They’re stealing it from AMD at TSMC :ROFLMAO:

    Still, Intel is probably the only party that can do anything about prices – assuming they bring real availability, they’re probably going to want to buy market share by sacrificing margins for their first few ‘enthusiast’ generation releases, and that’s going to put downward pressure on Nvidia and AMD.

    Supposing Intel does that, I don’t expect them to continue forever as they do very much like their margins, but they may very well help reset prices somewhat.
    [/QUOTE]
    lol, yep, another TSMC product waiting in line. I would expect Intel to concentrate on OEM builds for ARC first while retail gets the scraps. They have a number of upcoming versions; whether any will be on an Intel process besides the iGPU versions, I do not know. I would hope Intel can take control of their own manufacturing process for these; maybe TSMC is just that much better process-wise, giving Intel a chance to compete in GPUs.

  20. [QUOTE=”noko, post: 47334, member: 69″]
    lol, yep, another TSMC product waiting in line. I would expect Intel to concentrate on OEM builds for ARC first while retail gets the scraps.
    [/QUOTE]
    I read this sentence with a touch of sadness – I don’t want it to be true, but I have to admit that it comes across as entirely plausible.
    [QUOTE=”noko, post: 47334, member: 69″]
    They have a number of upcoming versions; whether any will be on an Intel process besides the iGPU versions, I do not know. I would hope Intel can take control of their own manufacturing process for these; maybe TSMC is just that much better process-wise, giving Intel a chance to compete in GPUs.
    [/QUOTE]
    It’s not that Intel can’t make the discrete GPUs that they’re farming to TSMC, but rather that while Intel is upgrading their own fabs to catch back up in CPU production, TSMC has the better fabrication technology for Intel’s GPUs. It probably helps that TSMC has had decades of experience fabbing ATi / Nvidia / AMD GPUs as well.

    Whether Intel will dedicate fab capacity in the future toward producing their own discrete GPUs is anyone’s guess. I’d think that they’d want to, but they may not know themselves whether that will be more cost efficient than using TSMC or other third-party fabs.

  21. [QUOTE=”LazyGamer, post: 47333, member: 1367″]
    They’re stealing it from AMD at TSMC :ROFLMAO:

    Still, Intel is probably the only party that can do anything about prices – assuming they bring real availability, they’re probably going to want to buy market share by sacrificing margins for their first few ‘enthusiast’ generation releases, and that’s going to put downward pressure on Nvidia and AMD.

    Supposing Intel does that, I don’t expect them to continue forever as they do very much like their margins, but they may very well help reset prices somewhat.
    [/QUOTE]
    I don’t think Intel will be our white knight here – so far they have signaled their GPUs will be open for business for mining. Which means just more blood for the blood god.

  22. [QUOTE=”Brian_B, post: 47340, member: 96″]
    I don’t think Intel will be our white knight here – so far they have signaled their GPUs will be open for business for mining. Which means just more blood for the blood god.
    [/QUOTE]
    Maybe Intel can out-blood AMD and Nvidia. AMD with TSMC and Nvidia with Samsung are, at this point, not even coming close to satisfying demand, and they’re pissing many folks off too. Still, I’m doubtful TSMC will do much better with Intel.

  23. [QUOTE=”noko, post: 47347, member: 69″]
    Maybe Intel can out-blood AMD and Nvidia. AMD with TSMC and Nvidia with Samsung are, at this point, not even coming close to satisfying demand, and they’re pissing many folks off too. Still, I’m doubtful TSMC will do much better with Intel.
    [/QUOTE]
    Nvidia’s back to TSMC for the RTX 4000-series too.

    I’m wondering when Samsung will pick up another run. They are fabbing AMD GPUs into their new Exynos SoCs, so maybe AMD is willing to take a chance on them for some parts?

  24. Intel is very serious, except in this case 2025 does not help much now

    [URL unfurl=”true”]https://time.com/6140476/intel-building-factory-ohio/[/URL]

  25. [QUOTE=”LazyGamer, post: 47348, member: 1367″]
    Nvidia’s back to TSMC for the RTX 4000-series too.

    I’m wondering when Samsung will pick up another run. They are fabbing AMD GPUs into their new Exynos SoCs, so maybe AMD is willing to take a chance on them for some parts?
    [/QUOTE]
    AMD needs to do something to grow; increasing prices without dramatically increasing the number of chips, hitting a wall so to speak, will not go well over time. So yes, I would think Samsung could/should be used on some SKUs to increase supply.

  26. Well, I checked the local MC….. These nasty assed 6500’s are in stock for $199. Ugh….
    [URL unfurl=”true”]https://www.microcenter.com/product/646514/asrock-amd-radeon-rx-6500-xt-phantom-gaming-d-overclocked-dual-fan-4gb-gddr6-pcie-40-graphics-card?sku=370973&utm_source=20220121_A_eNews_Computer_Parts_R6448&utm_medium=email&utm_campaign=R6448&MccGuid=C1BA145D-CE5D-49F6-950F-AF13D4AA7FFF[/URL]

  27. [QUOTE=”David_Schroth, post: 47323, member: 1″]
    So… if you consider the current street prices (and forget history) for all currently produced GPUs, the 6500 XT family is priced for its performance right where it should be, assuming that street price ends up in the $200-300 range.

    Going by a browsing of sold ebay items, RTX 3060’s are selling for $700-800, RX 6600 XTs are selling for about $600 and the 6600 non-XT are selling for $600. Of course, the RTX 3050 is TBD with respect to street pricing. Therefore, in theory, if the street price even settles at $300-350, you’re still paying HALF what you would for the next step up, which gives you a choice – Am I OK with low-medium 1080p settings in current AAA titles and high settings for older/low graphics intensive games? Or should I spend twice as much for a better 1080p experience?

    If you go by historic prices and market placement, this is a low $100 MSRP card (and I could go on and on about where all of these cards _should_ be priced in the current reality). Unfortunately, this is the reality we’ve been dealing with for a couple of years and I don’t see that changing anytime soon unless maybe the Eth price keeps cratering and stays cratered like it has been this week.
    [/QUOTE]
    Prices aside, for the moment, just based on the hardware specifications, engineering, and what market this GPU is aimed at, it’s wholly under-engineered, with too many cuts for even a dedicated desktop PC gaming video card.

    This is backed up by the fact that we now know this was intended as a laptop GPU, meant to sit alongside an APU that would already have VCE capabilities, and thus to require only limited output connections and limited PCIe bandwidth; again, it was meant to pair with the Rembrandt 6000-series APUs.

    They basically pushed a laptop GPU to the extreme to make it a dedicated desktop GPU, wrapped it up in that package, and in that regard it is completely behind the curve for such a role. It is now backward, with older GPUs offering features it lacks.

    This further demonstrates to me the fall of low-end GPUs on the desktop and backs up the idea that this market is fading for new GPU IPs.
