Image: AMD

Frank Azor, Chief Architect of Gaming Solutions & Marketing at AMD, has tweeted an image suggesting that red team's Radeon RX 6000 Series graphics cards offer better performance per dollar than NVIDIA's competing GeForce products. Among the comparisons in the chart shared today is AMD's new Radeon RX 6950 XT ($1,110) flagship, which is depicted as offering 80% more FPS per dollar than the GeForce RTX 3090 ($1,700), based on Newegg's lowest prices as of May 10, 2022. AMD also claims that its most powerful RDNA 2 option for desktop gamers offers 22% better FPS per watt than the non-Ti version of green team's Ampere flagship. The comparisons come just a week after the Radeon RX 6950 XT, Radeon RX 6750 XT, and Radeon RX 6650 XT arrived on shelves on Tuesday, May 10.
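The FPS-per-dollar metric behind AMD's chart is simple arithmetic, and the claimed 80% gap can be reproduced with a minimal sketch. The prices below are the Newegg figures from the chart; the average FPS values are hypothetical placeholders (AMD's underlying benchmark numbers are not in the tweet), chosen only to illustrate how the percentage falls out:

```python
def fps_per_dollar(avg_fps: float, price: float) -> float:
    """Frames per second delivered for each dollar spent."""
    return avg_fps / price

def advantage_pct(a: float, b: float) -> float:
    """Percentage by which value a exceeds value b."""
    return (a / b - 1.0) * 100.0

# Prices: Newegg's lowest as of May 10, 2022, per the chart.
# FPS figures: hypothetical, NOT AMD's actual test data.
rx_6950_xt = fps_per_dollar(avg_fps=120.0, price=1110.0)
rtx_3090 = fps_per_dollar(avg_fps=102.0, price=1700.0)

print(f"RX 6950 XT advantage: {advantage_pct(rx_6950_xt, rtx_3090):.0f}% more FPS per dollar")
```

Note that with those placeholder FPS values the price gap alone accounts for most of the advantage: at equal FPS, the $1,110-vs-$1,700 pricing already yields roughly a 53% FPS-per-dollar lead.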

Image: AMD

As a longtime gamer I’m grateful for the renewed competition in high-end graphics, we all win from it. As an @AMD employee I’m super proud of what our @Radeon team has accomplished.

With a 2.1GHz Game Clock coupled with 16GB of high-speed GDDR6 memory, the AMD Radeon RX 6950 XT graphics card delivers incredible performance and breathtaking visuals for the most demanding AAA and esports titles at 4K resolution with max settings. The AMD Radeon RX 6750 XT graphics card offers a cutting-edge, high-performance gaming experience at 1440p resolution with max settings, while the AMD Radeon RX 6650 XT graphics card offers ultra-smooth, high-refresh rate 1080p gaming with max settings in the latest titles.

Source: Frank Azor



16 comments

  1. I think that a refreshed set of numbers with current bios and driver updates to show the current state of AMD vs NVIDIA card performance would be interesting.

    We're at a point now where data from six months ago doesn't reflect the performance improvements of AMD cards. I can't speak to NVIDIA cards; I'm curious to see how they have improved as well. (I suppose we'll find out once the new set promising double-digit performance improvements hits.)
  2. I think that a refreshed set of numbers with current bios and driver updates to show the current state of AMD vs NVIDIA card performance would be interesting.

    We're at a point now where data from six months ago doesn't reflect the performance improvements of AMD cards. I can't speak to NVIDIA cards; I'm curious to see how they have improved as well. (I suppose we'll find out once the new set promising double-digit performance improvements hits.)
    It should be pretty easy to predict, though - with lagging RT performance and no real challenger for DLSS, AMD is at a two-front disadvantage, and those are hardware limitations that they're not going to be remedying for RX6000.

    A bigger concern for consumers is just whether those make any difference, as well as software support for AMD for various content-creation and say AI / ML workloads.

    Being faster in rasterization workloads and generally being available with more VRAM are nice, but they're not the whole picture.
  3. with lagging RT performance and no real challenger for DLSS
    I think one of those is accurate - the other not so much any longer. I'd also question just how pertinent RT performance really is in most gaming scenarios. Sure, there are a few games that use it, but it isn't enough to make me want to throw a lot of extra money at it. At least not yet, in these early generation titles and hardware.

    I certainly prioritize rasterizing performance over all else when I look at gaming --- RT performance is just a bonus, as is DLSS/FSR -- not all games use those, but almost every game uses rasterizing and will continue to do so into the foreseeable future. That said, up until very recently the only thing that outweighed rasterizing performance was actual availability: that appears to at least be getting improved on all fronts.
  4. I think one of those is accurate - the other not so much any longer.
    We can only hope! If AMD can get some inertia behind an alternative, that makes my next laptop choice just that much easier...
  5. I'd just like to see the games the guy chose to represent FPS per dollar spent.

    For almost a year, AMD has been like vaporware...

    The latest Steam Survey lays it all out there pretty well.
    The WHOLE of AMD leads Intel integrated graphics by about 0.4%.
    80% of all users have an NVIDIA GPU, but in the entire April survey there is NO meaningful percentage of 30-series GPUs, nor is any AMD 6000-series card listed above 0.05% usage.

    Where are all the expensive cards going? Mining?
  6. I think one of those is accurate - the other not so much any longer. I'd also question just how pertinent RT performance really is in most gaming scenarios. Sure, there are a few games that use it, but it isn't enough to make me want to throw a lot of extra money at it. At least not yet, in these early generation titles and hardware.

    I certainly prioritize rasterizing performance over all else when I look at gaming --- RT performance is just a bonus, as is DLSS/FSR -- not all games use those, but almost every game uses rasterizing and will continue to do so into the foreseeable future. That said, up until very recently the only thing that outweighed rasterizing performance was actual availability: that appears to at least be getting improved on all fronts.
    DLSS isn't a bonus. Basically it boils down to this:

    Here is ray tracing - it's cool and can make a game look better (arbitrary rating: it looks 10 "points" better). But our stuff isn't fast enough to actually use it, so here is a workaround that makes your better-looking options look worse (arbitrary deduction of DLSS: -8 "points"). Net result of DLSS? Why bother.
  7. Remember when performance was so good from video cards that they were touting being able to run at a higher-than-native resolution and then scale the image down to your native resolution because of 'sharpness' or something? This is the inverse of that.
  8. Definitely interesting perspective shared by Frank Azor of AMD


    The 6650 XT and 3060 Ti officially have the same "MSRP," but one is classified as a 1440p card while the other is classified as a 1080p card!

    To put the question the other way round: if the 3060 Ti starts selling for $400, then what should the street price of the 6650 XT be?

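    The parity question above is just a ratio: two cards deliver equal FPS per dollar when their prices scale with their performance. A minimal sketch, using an assumed performance ratio (not a published benchmark figure) for illustration:

    ```python
    def parity_price(ref_price: float, perf_ratio: float) -> float:
        """Street price at which a card matches a reference card's FPS per dollar.

        perf_ratio: this card's average FPS divided by the reference card's.
        """
        return ref_price * perf_ratio

    # Hypothetical assumption: suppose the RX 6650 XT averages ~90% of the
    # RTX 3060 Ti's FPS at 1080p. At a $400 3060 Ti street price, parity is:
    print(f"${parity_price(ref_price=400.0, perf_ratio=0.90):.0f}")  # -> $360
    ```

    Plugging in a measured performance ratio from a benchmark suite of your choice would give a defensible answer to the question.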
  9. DLSS isn't a bonus. Basically it boils down to this:

    Here is ray tracing - it's cool and can make a game look better (arbitrary rating: it looks 10 "points" better). But our stuff isn't fast enough to actually use it, so here is a workaround that makes your better-looking options look worse (arbitrary deduction of DLSS: -8 "points"). Net result of DLSS? Why bother.
    This would be true if DLSS were only useful with RT - however, it's useful everywhere. RTX2060? RTX3050? 8K monitor? Laptop with dGPU?

    It's not always worth the tradeoff, but the benefit is there and is real.

    Remember when performance was so good from video cards that they were touting being able to run at a higher-than-native resolution and then scale the image down to your native resolution because of 'sharpness' or something? This is the inverse of that.
    Well, supersampling is still being touted in various forms - consider the vast majority of games that remain popular or are becoming popular that do not really stress graphics performance.


    Now, consider being a gamer that plays games that fall into both of the above categories.

    Definitely interesting perspective shared by Frank Azor of AMD


    The 6650 XT and 3060 Ti officially have the same "MSRP," but one is classified as a 1440p card while the other is classified as a 1080p card!

    To put the question the other way round: if the 3060 Ti starts selling for $400, then what should the street price of the 6650 XT be?

    Remember that marketing slides are produced by marketing departments.
  10. Agreed - and that's a lot of games, and a lot of gamers. Fair or not, there's real inertia behind DLSS.
    Not to be contrary... OK, to be contrary... sure, NVIDIA has some movement on that front. Now that FSR is more mature, AND is a solution that can be used on current-generation Xbox and PlayStation hardware, you don't think it will see speedy adoption?
  11. Agreed - and that's a lot of games, and a lot of gamers. Fair or not, there's real inertia behind DLSS.
    Yeah, there's a good number of games. 129, by last count I could find (here). A lot of them are heavily played AAA games, like Fortnite, Call of Duty, CP2077, and Minecraft.

    For a tech that's hardware-locked to generations of GPUs that have been out of reach for most of their release, that isn't shabby. I will give you that NVIDIA has a very dominant place in gamer hardware right now, and a lot of gamers are being exposed to DLSS.

    But.

    DLSS just passed its third birthday (released Feb 2019). There have been a lot of games released since 2019, and apart from some large and notable AAA releases (which likely got promotional support from NVIDIA), it hasn't seen broad adoption. 129 is just a drop in the bucket compared to the more than 50,000 games total listed on Steam. I don't know that I'd call it inertia just yet. Promising, yes, but for technology that has drop-in support for both Unreal and Unity, I would expect a lot more support, and lately it seems the buzz has worn off and it just isn't getting picked up. Maybe I'm just not in the right circles to be noticing the buzz, though.

    Maybe adoption will pick up as we see GPU availability come back to the realm of sanity, the mid-to-lower tiers actually become affordable and attainable (you know, the ones that most people were buying in the first place), and those cards support DLSS.

    ~~~~~

    Some interesting numbers, courtesy of the ever controversial Steam Hardware Survey, April '22 edition. I will only mention one of my takeaways - I'll let everyone come to their own conclusions regarding this, or just disregard it based on the source as you see fit.

    The Top GPU is the 1060, at 7.15%.

    #2 and #3 are the 1660 and 1050Ti, at 6.48% and 5.63%, respectively.

    Of the top 10 GPUs, the most capable / expensive of them all is the 3060, placed at #10, with 2.18%

    Of the top 10 GPUs installed, only 3 are DLSS capable.

    The top 10 GPUs represent a combined ~40% of the total. Shares per card plunge to the 1% range after this.

    The top 20 GPUs represent a combined ~55% of the total. Shares per card plunge to <1% range shortly after this.

    Of all GPUs, approximately 21.5% are DLSS capable. My only takeaway is this: the number is much higher than I expected it to be (although nowhere near what it should be, given that a near-bottom-tier card three generations old is the #1 card), and it does give some evidence that DLSS ~could~ pick up. I just don't see huge adoption among developers to support that yet, though. Promising, but not critical mass yet.
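    The share figures above aggregate the same way for any slice of the survey; a minimal sketch over the four named cards (shares are the April '22 figures quoted above, and the DLSS flags reflect that only RTX-series cards support DLSS):

    ```python
    # (model, survey share in %, supports DLSS) -- subset of the quoted data.
    gpus = [
        ("GTX 1060",    7.15, False),
        ("GTX 1660",    6.48, False),
        ("GTX 1050 Ti", 5.63, False),
        ("RTX 3060",    2.18, True),
    ]

    total_share = sum(share for _, share, _ in gpus)
    dlss_share = sum(share for _, share, dlss in gpus if dlss)
    print(f"{dlss_share:.2f}% of the {total_share:.2f}% listed are DLSS capable")
    ```

    Running the same sum over the full survey table is how you arrive at figures like the ~21.5% DLSS-capable share.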
