Image: AMD

During today’s Financial Analysts Day 2020 event, Radeon Technologies Group SVP David Wang had some pretty cool stuff to share regarding AMD’s next-generation GPU architecture, RDNA 2.

RDNA 2 is what will power AMD’s upcoming lineup of Radeon products, as well as next-generation consoles such as the Xbox Series X. While we don’t know the full story behind the PlayStation 5 yet, Microsoft recently confirmed that the Series X utilizes a “custom designed processor leveraging AMD’s latest Zen 2 and RDNA 2 architectures.”

Thanks to RDNA 2, variable rate shading and hardware-accelerated DirectX ray tracing will be supported on the Xbox Series X and future Radeon cards. But what kind of general performance improvements should gamers expect?

According to one slide that Wang shared, RDNA 2 provides a 50-percent performance-per-watt improvement over its predecessor. This is just an “internal estimate,” but it suggests the new architecture will be a sizable jump.


A roadmap of the RDNA architecture was also presented. It confirms some of the major features that RDNA 2 will support – e.g., ray tracing and variable-rate shading – but what’s really interesting is the mention of RDNA 3, which AMD expects to introduce by 2022. Based on the company’s work so far, another 50-percent improvement in performance per watt could be in order.
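Taken at face value, those two 50-percent figures compound. Here is a quick back-of-the-envelope sketch – the 1.5x factors come from AMD’s slides, while everything else is just illustrative arithmetic, not AMD data:

```python
# Hypothetical compounding of AMD's stated perf-per-watt targets.
# The 1.5x factors are the "50 percent" figures from the slides;
# the baseline of 1.0 is arbitrary.

rdna1 = 1.0          # baseline: original RDNA
rdna2 = rdna1 * 1.5  # RDNA 2: +50% performance per watt
rdna3 = rdna2 * 1.5  # RDNA 3: another +50%, if AMD delivers

print(f"RDNA 2 vs RDNA: {rdna2:.2f}x perf-per-watt")  # 1.50x
print(f"RDNA 3 vs RDNA: {rdna3:.2f}x perf-per-watt")  # 2.25x
```

Note that a perf-per-watt gain can be cashed in either direction: roughly 50 percent more performance at the same power, or the same performance at about two-thirds the power (1 / 1.5 ≈ 0.67).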


RDNA 2 might ultimately serve as a turning point for ray-tracing adoption, as the architecture will be at the full disposal of both PC and console developers. AMD says they’ll have more to share about it soon.



5 Comments

  1. That’s pretty exciting. I really hope AMD can tilt the crown that Nvidia has been wearing for too long. Only time will tell. Seems everyone is claiming 50% or 100% performance improvements with the next generation.

  2. Idk.

    Here recently we’ve only seen 15-20% between major releases.

    It has hit a bit higher in places (the typical stagger between the xx80 and the xx80 Ti, for instance) – maybe 40% if you compare full generations and discount the interim releases – but not 100%.

  3. If they really want to take on NV’s current 2080 Ti, they are going to need at least a 40-50% improvement. A very quick, and generalized, search of [URL='https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-AMD-RX-5700-XT/4027vs4045']UserBenchmark[/URL] shows a 2080 Ti besting a 5700 XT by anywhere from 24% to 60%. I hope they do though. We need competition in this sector.

  4. Well, I think everyone (in general internet terms) is too confident in Nvidia being able to beat AMD with the 3000 series.
    I mean, this may be true, but I wouldn’t be sooo sure.
    AMD has been immersed in 7nm for a long time (in relative terms). AMD’s architectures didn’t have the efficiencies ironed out that Nvidia has had for a while, and they are finally implementing them with RDNA 2, meaning Nvidia might have less to gain this generation. AMD also worked out getting higher clocks out of their architectures, which may or may not carry over to RDNA 2. Now, yes, if Nvidia beats them but it takes a massive 700mm2 die with massive bandwidth and a massive price, is that really a win?
    We will see what ‘big Navi’ is… But to be honest, I want big performance on a budget… say $600-700 for the top end. If AMD comes out with a $600 to $700 card that gives a run for its money to a massive-die ridiculous Nvidia card, then good. Maybe some day I will build the computer for VR that I dream about but never do.

  5. [QUOTE="Uvilla, post: 10858, member: 397"]
    Well, I think everyone (in general internet terms) is too confident in Nvidia being able to beat AMD with the 3000 series.
    [/QUOTE]

    It’s … possible … that AMD could come out with a market-leading performance card.

    But it’s like betting on a horse that’s half a lap behind. Sure, it’s possible, and it’s happened before, but 9999 out of 10,000 times the horse in the lead is going to win; it’s going to take a miracle. AMD still has a ways to go before they are really competitive with the top tier offerings from nVidia’s current generation, let alone whatever may come out in the next generation later this year.

    nVidia’s architecture just has a very large lead. AMD wasn’t able to beat it even with a process node advantage this generation. I’m not saying AMD can’t close that gap, but to expect AMD to go from as far behind as they are now, to market leading, in the span of just another generation is not a very realistic expectation.

    For the record, I don’t think AMD necessarily needs to be market leading in terms of performance. It’s nice to see, and there’s definitely a halo effect that occurs. A lot of folks here may purchase top-tier cards, but I think the middle and lower tiers are far and away the bulk of the volume, and AMD has always been very competitive in those categories. A lot of people, even in enthusiast forums such as this, will poo-poo AMD for not having a market-leading card, even though they themselves don’t own a top-tier card (or have any real intention of purchasing one) and it doesn’t really affect them.

    Even if you look at the CPU side of the house – Zen 1 didn’t really beat Intel at anything apart from price – Intel still had an IPC advantage and could be competitive on core count (albeit at a very high price). But just closing that gap and getting close was enough to generate an awful lot of excitement. It took another two generations for AMD to pull ahead – and that’s with Intel stuck on the same architecture, unable to really push its own generational gains, and sitting a process node back. I don’t think nVidia is stuck in the same rut Intel is right now, and that will make it a much bigger fight for AMD.

    (my personal favorite bias is anytime an nVidia owner has an issue with a card, it’s obviously a power supply problem; but if an AMD owner has a problem, it’s always because of crap drivers)
