AMD Radeon RX 6900 XT FSR

On this page, we will focus on AMD Radeon RX 6900 XT performance specifically. On the next page, we will look at GeForce RTX 3090 performance, and after that, we will compare the two video cards. We are going to look at 1440p and 4K performance, with and without Ray Tracing, comparing native rendering against each of the FSR quality settings.

1440p

Godfall Radeon RX 6900 XT FSR Performance Graph

In the above graph, we are looking at 1440p and Epic Quality in the game on the Radeon RX 6900 XT. We start with no FSR in the top bar and move down through all the FSR quality settings. As the quality setting steps down, performance goes up. Each setting, even Ultra Quality, improves performance quite a bit in the game, and this is without Ray Tracing.
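For context on why each step gains performance: FSR's quality modes are fixed render-scale ratios, so each step down renders fewer pixels before upscaling to the output resolution. Here is a minimal sketch, assuming AMD's published FSR 1.0 scale factors (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x); the exact rounding Godfall uses internally may differ:

```python
# Sketch: approximate internal render resolutions for FSR quality modes,
# assuming AMD's published FSR 1.0 scale factors (actual game rounding may vary).
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(width: int, height: int, mode: str) -> tuple:
    """Approximate internal render resolution for a given FSR quality mode."""
    scale = FSR_SCALE[mode]
    return round(width / scale), round(height / scale)

for mode in FSR_SCALE:
    w, h = internal_resolution(2560, 1440, mode)
    print(f"{mode:>13}: {w}x{h}")
```

At 1440p, for example, Performance mode renders internally at 1280x720, which is why the frame-rate gains can be so large.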

Turning on Ultra Quality increases performance 17% in the game at 1440p. Turning on Quality further improves performance 3%, Balanced another 3%, and Performance another 1%. There is a clear fall-off in the benefit after Quality; moving to Balanced and Performance yields only a marginal uplift. The greatest difference comes from simply turning on Ultra Quality.
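Since each step's percentage is quoted relative to the previous step, the gains compound. A quick sketch, which is just arithmetic on the percentages above rather than a new measurement, gives the total uplift over native at 1440p:

```python
# Each FSR step's gain is quoted relative to the previous step,
# so the total uplift over native is the product of the step factors.
steps_1440p = [0.17, 0.03, 0.03, 0.01]  # Ultra Quality, Quality, Balanced, Performance

def cumulative_uplift(gains):
    """Compound chained per-step gains into one total uplift over native."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

print(f"Native -> Performance FSR at 1440p: +{cumulative_uplift(steps_1440p):.1%}")
# prints: Native -> Performance FSR at 1440p: +25.4%
```

So the full slide from native to Performance mode works out to roughly a 25% frame-rate gain at 1440p without Ray Tracing.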

Godfall Radeon RX 6900 XT 1440p Ray Tracing FSR Performance Graph

In the above graph we are looking at 1440p at Epic Quality on the Radeon RX 6900 XT, and we have now turned on Ray Tracing. We can see that FSR still increases performance at 1440p with Ray Tracing enabled.

Turning on Ultra Quality improves performance 24%, which is greater than without Ray Tracing. Turning on Quality FSR further improves performance 9%. Turning on Balanced FSR further improves performance 4%. Finally, Performance FSR further improves performance 3%. It does seem FSR improves performance to a greater extent with Ray Tracing on than it did without. There’s still not a big performance difference, though, between Balanced and Performance.

4K

Godfall Radeon RX 6900 XT 4K FSR Performance Graph

In this graph above we are now running at 4K and Epic Quality on the Radeon RX 6900 XT. We do see that FSR is improving performance at each step, and more so than it did at 1440p, especially between Balanced and Performance. Turning on Ultra Quality FSR improves performance 42%, which is much higher than what we saw at 1440p. Turning on Quality FSR further improves performance 14%. Turning on Balanced FSR further improves performance 11%. Turning on Performance FSR further improves performance 8%. We do see diminishing returns as you decrease the quality; the move from native to Ultra Quality FSR alone is huge.

Godfall Radeon RX 6900 XT 4K Ray Tracing FSR Performance Graph

In this graph above we are at 4K on the Radeon RX 6900 XT and we have now turned on Ray Tracing. We once again see that FSR improves performance down the scale with Ray Tracing. Turning on Ultra Quality FSR improves performance by 48%, the highest result we’ve seen with FSR. It really makes the game more playable at 4K with Ray Tracing turned on. Turning on Quality FSR further improves performance by 18%. Turning on Balanced FSR further improves performance by 13%. Turning on Performance FSR further improves performance by another 13%. Those are solid performance improvements at each step.
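Because each step's gain is quoted relative to the previous one, the totals compound. Chaining the percentages from all four scenarios on this page (again, just arithmetic on the quoted figures, not new measurements) makes the trend explicit: FSR's total uplift over native grows with resolution, and grows again with Ray Tracing:

```python
# Per-step gains quoted on this page; each is relative to the prior step.
scenarios = {
    "1440p":      [0.17, 0.03, 0.03, 0.01],
    "1440p + RT": [0.24, 0.09, 0.04, 0.03],
    "4K":         [0.42, 0.14, 0.11, 0.08],
    "4K + RT":    [0.48, 0.18, 0.13, 0.13],
}

def total_uplift(gains):
    """Compound chained per-step gains into one total uplift over native."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

for name, gains in scenarios.items():
    print(f"{name:>10}: +{total_uplift(gains):.0%} over native")
# 1440p: +25%, 1440p + RT: +45%, 4K: +94%, 4K + RT: +123%
```

At 4K with Ray Tracing, going from native all the way to Performance FSR more than doubles the frame rate, while at 1440p without Ray Tracing the total gain is about a quarter.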


Brent Justice

Brent Justice has been reviewing computer components for 20+ years. Educated in the art and method of the computer hardware review, he brings experience, knowledge, and hands-on testing with a gamer-oriented...

Join the Conversation

36 Comments

  1. [QUOTE=”Stoly, post: 40174, member: 1474″]
    An IQ comparison would’ve been nice.

    Follow up article maybe?
    [/QUOTE]

    Never say never.

    But this article was focused on the performance aspect. Naturally, anything that isn’t the native resolution will be of a lower quality; it just depends on how much of a downgrade in quality, and if it’s noticeable. You enter into FSR knowing the potential of a downgrade in image quality. The point and goal of enabling FSR is to increase performance. That’s the only reason you enable it.

  2. [QUOTE=”Brent_Justice, post: 40176, member: 3″]
    Never say never.

    But this article was focused on the performance aspect. Naturally, anything that isn’t the native resolution will be of a lower quality; it just depends on how much of a downgrade in quality, and if it’s noticeable. You enter into FSR knowing the potential of a downgrade in image quality. The point and goal of enabling FSR is to increase performance. That’s the only reason you enable it.
    [/QUOTE]

    I agree for the most part; thing is, all upscaling technologies promise high performance gains vs. “minimal” IQ drop. So it would be nice to know to what extent the promise is delivered.

    For better or for worse, upscaling is here to stay; everyone and their mother-in-law is coming up with their own version, be it hardware (DLSS, XeSS) or software (FSR, UE5). It will be interesting to see how they all stack up.

  3. The 6900xt did better than expected. This title might just be an anomaly in that regard though. Who knows.

    The standout to me is that at 4K with RT and high quality FSR/DLSS settings, both cards achieve completely playable framerates.

    [QUOTE=”Brent_Justice, post: 40176, member: 3″]
    Never say never.

    But this article was focused on the performance aspect. Naturally, anything that isn’t the native resolution will be of a lower quality; it just depends on how much of a downgrade in quality, and if it’s noticeable. You enter into FSR knowing the potential of a downgrade in image quality. The point and goal of enabling FSR is to increase performance. That’s the only reason you enable it.
    [/QUOTE]

    I’m interested in seeing this as well.

    I’ve never seen either FSR or DLSS in person, but when DLSS 2.0 was launched, some reviews were suggesting that it was a tradeoff in image quality, and that in some ways it could actually look better/sharper than native.

  4. [QUOTE=”Zarathustra, post: 40230, member: 203″]
    I’ve never seen either FSR or DLSS in person, but when DLSS 2.0 was launched, some reviews were suggesting that it was a tradeoff in image quality, and that in some ways it could actually look better/sharper than native.
    [/QUOTE]
    That’s never been my experience with it. Granted, Cyberpunk 2077 is the only place I need DLSS 2.0 using my RTX 3090. It’s also the only game I play that has ray tracing implemented to a degree that impacts performance negatively to a point where DLSS 2.0 is required to get desirable frame rates.

    In that game, image quality definitely suffers compared to native resolution. I’ve seen a few screenshots of other games where DLSS 2.0 arguably improved some aspect of image quality, but it tends to negatively impact it somewhere else.

  5. Cyberpunk 2077 is a very bad example of DLSS. DLSS relies heavily on the TAA implementation in the game, and Cyberpunk 2077’s TAA implementation is notoriously bad. Because DLSS is tied to it, the result falls short of DLSS’s potential.

    There are other games that demonstrate DLSS better.

    However, that is one thing to note about DLSS, it relies on temporal vectors, so the implementation of that in games will affect its quality. But at least it can do it, whereas FSR is spatial only at the moment.

  6. [QUOTE=”Brent_Justice, post: 41267, member: 3″]
    Cyberpunk 2077 is a very bad example of DLSS. DLSS relies heavily on the TAA implementation in the game, and Cyberpunk 2077’s TAA implementation is notoriously bad. Because DLSS is tied to it, the result falls short of DLSS’s potential.

    There are other games that demonstrate DLSS better.

    However, that is one thing to note about DLSS, it relies on temporal vectors, so the implementation of that in games will affect its quality. But at least it can do it, whereas FSR is spatial only at the moment.
    [/QUOTE]
    Thanks for the explanations on the differences between the two. Most of my gaming in the last couple of years has focused on games with DLSS and I’ve noticed the anomalies throughout the generations. I’ve recently been replaying RE Village and taking a closer note of FSR and its behavior since it wasn’t around when I played it the first time.

  7. Excellent take, and the 6900 XT RT performance is very surprising. I think as newer games with RT come out, the performance disparity between AMD and Nvidia will drift closer together. Still, I think Nvidia will remain on top due to larger-bandwidth memory as well as more compute units. AMD’s speed demons do help compensate, as does Infinity Cache (that a 256-bit memory bus with slower memory is even in the ballpark speaks highly of AMD’s design, albeit slower at this time).

  8. [QUOTE=”Grimlakin, post: 41789, member: 215″]
    Nice article. Well presented and easily understood. Plus the content was interesting as well. Thank you.
    [/QUOTE]
    This is one of the few places where they still do it right.

  9. [QUOTE=”noko, post: 41545, member: 69″]
    Excellent take, and the 6900 XT RT performance is very surprising. I think as newer games with RT come out, the performance disparity between AMD and Nvidia will drift closer together. Still, I think Nvidia will remain on top due to larger-bandwidth memory as well as more compute units. AMD’s speed demons do help compensate, as does Infinity Cache (that a 256-bit memory bus with slower memory is even in the ballpark speaks highly of AMD’s design, albeit slower at this time).
    [/QUOTE]

    I disagree.
    Godfall only uses DXR for Raytraced Shadows, meaning it is a DXR-lite game, where AMD does better than games using multiple DXR effects.
    If you look at titles doing a LOT more DXR…the pattern is very clear.
    NVIDIA has a major performance benefit when multiple DXR effects are used at the same time (Shadows, reflections, Lighting, GI) over AMD.
    Games like CyberPunk 2077, Metro Exodus and Control eg.

    If anything this shows AMD’s performance tanks when not doing “DXR-lite”.

  10. [QUOTE=”Grimlakin, post: 46580, member: 215″]
    I will say that later titles are using a more focused implementation.
    [/QUOTE]
    You need to elaborate on “focused”?
    The trend is quite clear: the more DXR effects, the more NVIDIA and AMD performance separates.

  11. [QUOTE=”ViewPort, post: 46636, member: 5185″]
    You need to elaborate on “focused”?
    The trend is quite clear: the more DXR effects, the more NVIDIA and AMD performance separates.
    [/QUOTE]
    What I am saying is games are being made to look as good as they can on console, and it’s eroding the advantage that Nvidia has simply because there isn’t the same depth of use.

  12. [QUOTE=”Grimlakin, post: 46652, member: 215″]
    What I am saying is games are being made to look as good as they can on console, and it’s eroding the advantage that Nvidia has simply because there isn’t the same depth of use.
    [/QUOTE]
    I feel that this is true, and that it seems to follow the collective recoil that the industry had upon the first more ‘fully-featured’ ray tracing implementations such as BF:V, which was deemed a bit excessive versus the appreciable visual benefits.

    However, I would stop short of couching this in a positive light. There’s no real doubt that AMD’s ray tracing implementation is less potent in their RX 6000-series versus Nvidia’s RTX 3000-series.

    Whether that matters is absolutely game, system, and user dependent, but we should still be clear in this distinction because we have no reason to suspect that the use of ray tracing will trend down over time. Yes, we’re seeing more limited use in current console games due to the limited ray tracing hardware present in those consoles, and at the same time as developers learn how to develop games with variable levels of ray tracing we can almost certainly expect ‘more involved’ versions that allow for higher settings on current and future hardware that can handle them.

  13. I think, speaking strictly technically – nVidia clearly has a stronger hardware solution. But – AMD has ~a~ solution now, and it’s prominently featured in both major consoles (that seem to be selling like hotcakes but no one can get their hands on).

    Speaking personally – RT just doesn’t excite me. It’s one of those technologies where I can see a difference if I really look in static screenshots, but when I’m playing a game – right now it just doesn’t make a difference and the only appreciable difference between RT on and off is that I can notice the framerate hit.

    Better Raytracing simply isn’t a consideration on if I buy a piece of hardware, or a game. Well, let me take a step back – because that’s assuming you even have a choice in hardware. Right now, it’s take what you can get with respect to hardware.

    Until we get more titles where raytracing is either required or makes a significant impact, I’d expect it to remain a parlor trick. And even if/when it does catch on, I expect it to become something ubiquitously supported in the background, like physics libraries are today. No one cares which hardware can run physics any better than the other now, but 20 years ago we sure geeked out about it.

  14. I’m still excited for the [I]potential[/I] of RT, as it’s that one step closer needed to get realistic shadows, lighting, and color – which are all actually different parts of the same thing (light intensity at particular frequencies, or the lack thereof).

    Expanded color and luminosity spaces, meaning stuff like Rec.2020 and actual HDR, really need it to be truly useful in games; otherwise color banding and shadow inaccuracies will continue to inhibit immersion.

    That all said – even the best hardware solutions run in optimal conditions aren’t there yet. Where I would personally have a preference, if having a preference happened to even be an option, would be for the [I]other[/I] technologies that Nvidia has delivered to market – DLSS and “RTX”-backed audio processing capabilities. DLSS alone is enough of a differentiator, IMO, especially if one is compromising on performance due to price and availability.

  15. What’s happening now is developers are trying various types of rt. Next generation or so will see more features combined artistically.

  16. [QUOTE=”Grimlakin, post: 46652, member: 215″]
    What I am saying is games are being made to look as good as they can on console, and it’s eroding the advantage that Nvidia has simply because there isn’t the same depth of use.
    [/QUOTE]
    That is a very simplistic view on a very complex problem.

    Ray Tracing can happen at different hardware levels:

    Level 0 – Legacy Solutions (CPUs)
    Level 1 – Software on Traditional GPUs (NVIDIA Pascal)
    Level 2 – Ray/Box and Ray/Tri Testers in Hardware (AMD console GPUs/RDNA2)
    Level 3 – Bounding Volume Hierarchy (BVH) Processing in Hardware (NVIDIA Turing/Ampere)
    Level 4 – BVH Processing with Coherency Sorting in Hardware
    Level 5 – Coherent BVH Processing with Scene Hierarchy Generator in Hardware

    The next level (Level 4) is regarding this:
    [ATTACH type=”full” alt=”1642060675910.png”]1410[/ATTACH]

    That is where the focus will be.
    Not on “If we only do one DXR effect, the consoles can kinda keep up”.
    Several games have shown a willingness to FAR exceed the DXR features and image quality available on the consoles.
    If you think consoles will “hold back” DXR implementation, I suspect you have misunderstood the benefits/scalability of implementing DXR in games.

  17. [QUOTE=”Brian_B, post: 46669, member: 96″]
    I think, speaking strictly technically – nVidia clearly has a stronger hardware solution. But – AMD has ~a~ solution now, and it’s prominently featured in both major consoles (that seem to be selling like hotcakes but no one can get their hands on).

    Speaking personally – RT just doesn’t excite me. It’s one of those technologies where I can see a difference if I really look in static screenshots, but when I’m playing a game – right now it just doesn’t make a difference and the only appreciable difference between RT on and off is that I can notice the framerate hit.

    Better Raytracing simply isn’t a consideration on if I buy a piece of hardware, or a game. Well, let me take a step back – because that’s assuming you even have a choice in hardware. Right now, it’s take what you can get with respect to hardware.

    Until we get more titles where raytracing is either required or makes a significant impact, I’d expect it to remain a parlor trick. And even if/when it does catch on, I expect it to become something ubiquitously supported in the background, like physics libraries are today. No one cares which hardware can run physics any better than the other now, but 20 years ago we sure geeked out about it.
    [/QUOTE]
    I have the opposite experience; after playing games with e.g. DXR GI, my brain gets “annoyed” when observing normal SSAO effects in other games…something is “off” (as in the lighting is more fake).
    I think Digital Foundry kinda summed it up in eg. this video:
    [MEDIA=youtube]hZp8fXLXgqg[/MEDIA]

    A bit more than a “parlor trick”.

  18. [QUOTE=”ViewPort, post: 46715, member: 5185″]
    If you think consoles will “hold back” DXR implementation, I suspect you have misunderstood the benefits/scalability of implementing DXR in games.
    [/QUOTE]
    I think a lot of us still have flashbacks to being stuck at DX9 forever because the consoles had a long stretch. Sure, DX10 and 11 did come about, but games just rarely took advantage of it because the consoles couldn’t.

  19. [QUOTE=”ViewPort, post: 46716, member: 5185″]
    A bit more than a “parlor trick”.
    [/QUOTE]
    To each their own – it doesn’t make the game better for me.

  20. [QUOTE=”Brian_B, post: 46728, member: 96″]
    To each their own – it doesn’t make the game better for me.
    [/QUOTE]
    Unless you set all graphical settings to LOW when you play, I am going to ignore your subjective non-argument…feel free to engage in a technical debate, but “I like pong-style graphics” has no relevance in regard to ray tracing and the performance gap (and why that gap is there) between vendors in DXR 🥳

    [QUOTE=”Brian_B, post: 46727, member: 96″]
    I think a lot of us still have flashbacks to being stuck at DX9 forever because the consoles had a long stretch. Sure, DX10 and 11 did come about, but games just rarely took advantage of it because the consoles couldn’t.
    [/QUOTE]
    I remember the DX10 launch quite well…hardware way before any games.
    The G80 came…ran supreme in DX9 games, and after a while a lot of DX10 games came out (DX10 adoption was slower than DXR’s) and it was supreme in DX10 too.

    DX11 was a bit more “meh”, but the multi-threaded driver issue did give AMD some issues.

    DX12 was more “huh”…until DXR hit. Before DXR, DX12 meant lower performance compared to DX11, goodbye to multi-GPU support, and a low-level codepath that was certainly not for every developer.
    DXR was the saving grace for DX12…hence the adoption of DXR is better than that of DX10.

    And here we are…where games do this:
    [ATTACH type=”full”]1415[/ATTACH]

    DXR on PC, nothing on console…this is not a novel trend…this generation of consoles is too weak for full DXR implementation…and for once…developers are not holding back DXR to “spare the consoles”…(except some shady business from the Unreal Engine 5 team to try and hide the performance gap between PC vs. consoles, aka “politics”).

  21. [QUOTE=”ViewPort, post: 46821, member: 5185″]
    Unless you set all graphical settings to LOW when you play, I am going to ignore your subjective non-argument…feel free to engage in a technical debate, but “I like pong-style graphics” has no relevance in regard to ray tracing and the performance gap (and why that gap is there) between vendors in DXR
    [/QUOTE]
    I don’t think that’s exactly what I meant there. I suspect you knew that, but that’s ok.

    There is nothing in Cyberpunk’s RT implementation, or any other game that I’m aware of, that makes playing the game any different with it on or off – apart from it being more shiny. It’s like the early days of PhysX, where if you had it, Batman’s cape would fly a bit better and explosions were spectacular, but it didn’t change the gameplay. Then, eventually, we got games that relied heavily on physics engines and wouldn’t exist without them – Kerbal, GTA5, Portal, … it’s a pretty long list today and games almost universally have some form of hardware-accelerated physics in their implementation.

    We are ways from that level with RT – that’s what I was trying to convey. Today’s RT is all just more shiny metal and mudpuddle reflections, it has a ways to go to get to something impactful. Not … Pong?

  22. I’d also say – Physics is an example of a tech that caught on, and caught on pretty big. It became pretty standard in every game. Shaders are another.

    But you have a lot that just ended up bullet points on the side of the box so that one manufacturer could claim an edge over another. How many proprietary implementations of Anti-aliasing have we seen come and go? I don’t think anyone will argue the merits of AA, but there have been so many various implementations of it, many of which with vendor lock-ins, that have come and gone. And now that’s extending to upscaling algorithms. Same thing with GPU-accelerated sound – it was a thing for a while if anyone recalls. You can probably put SLI/Crossfire on that list now.

    Raytracing may be the holy grail of rendering, but that doesn’t mean that today is the day for it to emerge, or that it will displace rasterizing entirely in the rendering pipeline. I don’t think it will fail entirely, I think (my personal opinion) that it will follow a similar path as physics, and that eventually it will just become transparent, but I don’t think it will completely displace rasterization. I just think we will stop ranking video cards on just RT merits — either it becomes transparent enough that it gets rolled into the overall performance metric (akin to shaders), or it fades into something that doesn’t heavily leverage the hardware (which is the route physics has).

    Today – my opinion, Raytracing performance is just another marketing point. It makes games prettier, but it doesn’t change gameplay. If one card performs better with RT than another, it doesn’t change your ability to play a game, just turn on some options. RT support is hardly universal across all games, so even the titles to where it can be used are fairly selective. If you have a situation where all other things are equal; sure, you’d rather have better RT performance than worse. But I wouldn’t sacrifice rasterization performance, or VRAM capacity, or price to get better RT performance. Even though RT may very well be the Next Big Thing — if it does, the video cards that are out once it gets there will be vastly superior for it when we get there, which keeps RT performance today a marginal metric at best.

    At least for me. Nothing to do at all with Pong here.

  23. Nice article. I have been looking for a recent apples-to-apples comparison since I have been looking for a new GPU lately. I am leaning towards a 6900 right now, mostly because of the ~$1000 price gap between in-stock 6900s and 3080 Ti/3090s, of course. But I have been thinking about testing team red again. I believe my previous card from them had ATi branding.

  24. [QUOTE=”ViewPort, post: 46577, member: 5185″]
    I disagree.
    Godfall only uses DXR for Raytraced Shadows, meaning it is a DXR-lite game, where AMD does better than games using multiple DXR effects.
    If you look at titles doing a LOT more DXR…the pattern is very clear.
    NVIDIA has a major performance benefit when multiple DXR effects are used at the same time (Shadows, reflections, Lighting, GI) over AMD.
    Games like CyberPunk 2077, Metro Exodus and Control eg.

    If anything this shows AMD’s performance tanks when not doing “DXR-lite”.
    [/QUOTE]
    For RT, there are also a lot of compute operations where Nvidia’s double FP32 per CUDA core gives a great benefit if fed. AMD’s advantage is if DXR 1.1 is used and Infinity Cache is effectively used, as in multiple shaders within a single shader keeping instructions/assets local in cache for compute operations, keeping the shaders fully used and busy. Nvidia’s biggest advantage is FP32 capability, while AMD’s is in keeping the shaders well fed with instructions (compute mostly) for RT using Infinity Cache. Comparing older RT games to newer, more AMD-aware games shows that AMD hardware can do RT better than Turing by a significant margin. It is not just casting rays, BVH, etc. for RT; once the hits/data are determined, it is also how well the arch computes the final output.

  25. [QUOTE=”noko, post: 47248, member: 69″]
    Comparing older RT games to newer, more AMD-aware games shows that AMD hardware can do RT better than Turing by a significant margin.
    [/QUOTE]
    You’ll want to substantiate this – do you have a reference, hopefully more than one, that clearly shows the differences?

    Also, it was clear from the outset with Turing that RT was a compromise. Old node and so on, and Nvidia took heat for that.

    But we’re not comparing RDNA2 to Turing – we’re comparing it to Ampere.

  26. [QUOTE=”LazyGamer, post: 47252, member: 1367″]
    You’ll want to substantiate this – do you have a reference, hopefully more than one, that clearly shows the differences?

    Also, it was clear from the outset with Turing that RT was a compromise. Old node and so on, and Nvidia took heat for that.

    But we’re not comparing RDNA2 to Turing – we’re comparing it to Ampere.
    [/QUOTE]
    Watch the whole video; if Lazy, 5 min in should give relevant detail dealing with AMD hardware (have to watch on YouTube):

    [MEDIA=youtube]sW0g0lqsqKA[/MEDIA]

    If you want an even drier version, again straight from AMD’s mouth, on programming particularly for a given hardware base, here you go. Note: Nvidia also gives tools, best practices, etc. that work best with Nvidia RT hardware. Hence RT performance is also very much dependent on how well it is coded for the hardware. Older RT games are in general not coded best for AMD hardware, hence a much larger discrepancy. Nvidia also benefits from DXR 1.1 usage in the newer games that use it.

    [MEDIA=youtube]cC-DDAq3PCM[/MEDIA]

  27. Not to discount the veracity of the videos, but they are from AMD and not from an independent tester. I’d have the same question if they were from Nvidia, or now that Intel is getting into the game, from Intel too.

  28. [QUOTE=”LazyGamer, post: 47264, member: 1367″]
    Not to discount the veracity of the videos, but they are from AMD and not from an independent tester. I’d have the same question if they were from Nvidia, or now that Intel is getting into the game, from Intel too.
    [/QUOTE]
    You expect an independent tester to rewrite code to see if it actually makes a difference? Not likely. Developers would be putting up a fuss if what AMD or Nvidia is promoting makes zero difference. The performance difference using RT between Nvidia and AMD is closer with newer games more aware of the hardware -> more recent RT titles that are also PS5 and/or Xbox Series X/S optimized (similar RDNA 2 hardware). You don’t have to take my word for it, just look at the performance differences. Both Nvidia and AMD take a hit, but now the hit to AMD is much less than it used to be with DXR 1-written games favoring Nvidia.

  29. [QUOTE=”noko, post: 47266, member: 69″]
    Both Nvidia and AMD take a hit, but now the hit to AMD is much less than it used to be with DXR 1-written games favoring Nvidia.
    [/QUOTE]
    You do realize that the logic you’re using here also implies that AMD’s RDNA2 is simply not as capable of running DXR 1 as Nvidia’s Ampere?

  30. [QUOTE=”LazyGamer, post: 47269, member: 1367″]
    You do realize that the logic you’re using here also implies that AMD’s RDNA2 is simply not as capable of running DXR 1 as Nvidia’s Ampere?
    [/QUOTE]
    Exactly implied and true. DXR 1 games will not effectively use the Infinity Cache, while DXR 1.1 games, if programmed to use the RDNA2 hardware, can.

  31. [QUOTE=”noko, post: 47248, member: 69″]
    For RT, there are also a lot of compute operations where Nvidia’s double FP32 per CUDA core gives a great benefit if fed. AMD’s advantage is if DXR 1.1 is used and Infinity Cache is effectively used, as in multiple shaders within a single shader keeping instructions/assets local in cache for compute operations, keeping the shaders fully used and busy. Nvidia’s biggest advantage is FP32 capability, while AMD’s is in keeping the shaders well fed with instructions (compute mostly) for RT using Infinity Cache. Comparing older RT games to newer, more AMD-aware games shows that AMD hardware can do RT better than Turing by a significant margin. It is not just casting rays, BVH, etc. for RT; once the hits/data are determined, it is also how well the arch computes the final output.
    [/QUOTE]

    I would like to see a game that uses MORE than one DXR effect that follows the “pattern” you just picked out of free fantasy.
    Trend:
    The more DXR effects in-game, the more AMD suffers compared to NVIDIA.

    DXR 1.1 was developed to make up for the deficiencies (lack of hardware BVH acceleration) of the RDNA2/console GPUs…not to be “better” than DXR 1. (DXR 1.1 inline ray tracing does not use separate dynamic shaders or shader tables, etc.; DXR 1.1 hides access to the acceleration structure (data structure traversal, box/triangle intersection, etc.) vs. DXR 1.)

  32. [QUOTE=”Brian_B, post: 46837, member: 96″]
    I don’t think that’s exactly what I meant there. I suspect you knew that, but that’s ok.

    There is nothing in Cyberpunk’s RT implementation, or any other game that I’m aware of, that makes playing the game any different with it on or off – apart from it being more shiny. It’s like the early days of PhysX, where if you had it, Batman’s cape would fly a bit better and explosions were spectacular, but it didn’t change the gameplay. Then, eventually, we got games that relied heavily on physics engines and wouldn’t exist without them – Kerbal, GTA5, Portal, … it’s a pretty long list today and games almost universally have some form of hardware-accelerated physics in their implementation.

    We are ways from that level with RT – that’s what I was trying to convey. Today’s RT is all just more shiny metal and mudpuddle reflections, it has a ways to go to get to something impactful. Not … Pong?
    [/QUOTE]

    That same flawed argumentation can be used about:
    T&L
    AA
    AF
    Basically ANY image quality uplift can be “addressed” the same way, and it is a fallacy that dismisses the technology and the benefits it brings to the table.

    And I have used reflections in windows in-game to spot hostiles…I call that a change of gameplay…due to accurate world reflections (not possible with SSR).

  33. [QUOTE=”ViewPort, post: 47520, member: 5185″]
    And I have used reflections in windows in-game to spot hostiles…
    [/QUOTE]
    You can do that without raytracing.
