Image: AMD

AMD shocked the graphics world today by unveiling its Radeon RX 6900 XT, Radeon RX 6800 XT, and Radeon RX 6800 cards, which – according to the red team’s first-party benchmarks – can trade blows with NVIDIA’s entire GeForce RTX 30 Series lineup (GeForce RTX 3090 included).

Something that CEO Dr. Lisa Su and Scott Herkelman (CVP & GM) seemed awfully secretive about, however, is the Radeon RX 6000 Series’s ray-tracing performance. While the RDNA 2 event showcased plenty of benchmarks that favored the Radeon RX 6900 XT, Radeon RX 6800 XT, and Radeon RX 6800 over the competition, none of them seemed to include metrics related to ray tracing.

Luckily, some users on r/AMD have provided a bit of insight on that after scoping out AMD’s new RDNA 2 landing page, which elaborates on the hardware component that Radeon RX 6000 Series GPUs leverage for ray tracing – the Ray Accelerator (RA). According to a diagram, each Compute Unit houses a single RA.

“New to the AMD RDNA 2 compute unit is the implementation of a high-performance ray tracing acceleration architecture known as the Ray Accelerator,” a description reads. “The Ray Accelerator is specialized hardware that handles the intersection of rays providing an order of magnitude increase in intersection performance compared to a software implementation.”

Thanks to the following footnote, which elaborates on how AMD reached that figure, users on r/AMD (e.g., NegativeXyzen) have been able to draw early ray-tracing performance comparisons between the Radeon RX 6800 XT and NVIDIA’s GeForce RTX 30 Series.

“Measured by AMD engineering labs 8/17/2020 on an AMD RDNA 2 based graphics card, using the Procedural Geometry sample application from Microsoft’s DXR SDK, the AMD RDNA 2 based graphics card gets up to 13.8x speedup (471 FPS) using HW based raytracing vs using the Software DXR fallback layer (34 FPS) at the same clocks. Performance may vary.”

As always, there are plenty of factors that could affect performance, but here’s how the Radeon RX 6800 XT’s ray-tracing prowess looks vs. NVIDIA’s GeForce RTX 3090 and GeForce RTX 3080, courtesy of a comparison shared by VideoCardz.

GPU                       | RT Cores / Ray Accelerators | DXR Performance | Tensor Cores
NVIDIA GeForce RTX 3090   | 82                          | 749 FPS         | 328
NVIDIA GeForce RTX 3080   | 68                          | 630 FPS         | 272
NVIDIA GeForce RTX 3070   | 46                          | Not tested      | 184
AMD Radeon RX 6900 XT     | 80                          | Not tested      | N/A
AMD Radeon RX 6800 XT     | 72                          | 471 FPS         | N/A
AMD Radeon RX 6800        | 60                          | Not tested      | N/A

Assuming these numbers are accurate, the Radeon RX 6800 XT’s ray-tracing performance appears to be roughly 37 percent lower than the GeForce RTX 3090’s (471 FPS vs. 749 FPS). The Radeon RX 6800 XT also trails the GeForce RTX 3080 by about 25 percent in that regard (471 FPS vs. 630 FPS).
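
If you want to sanity-check those figures yourself, the numbers quoted above are all you need. Here’s a quick back-of-the-envelope sketch in Python that re-derives AMD’s speedup claim and the RX 6800 XT’s relative standing from the table; it’s an illustration of the arithmetic, not additional benchmark data.

```python
# Re-deriving the comparison from the figures quoted above (no new data).
hw_dxr_fps = 471    # RX 6800 XT, DXR Procedural Geometry sample, hardware ray tracing
sw_dxr_fps = 34     # same card, software DXR fallback layer
rtx_3090_fps = 749  # per VideoCardz's comparison
rtx_3080_fps = 630

# AMD's "up to 13.8x" claim: hardware RT vs. the software fallback layer.
print(f"HW vs. SW fallback: {hw_dxr_fps / sw_dxr_fps:.1f}x")   # ~13.9x from these rounded figures

# Relative deficit of the RX 6800 XT in the same sample application.
for name, fps in (("RTX 3090", rtx_3090_fps), ("RTX 3080", rtx_3080_fps)):
    deficit = (fps - hw_dxr_fps) / fps * 100
    print(f"RX 6800 XT vs. {name}: {deficit:.0f}% lower")       # ~37% and ~25%
```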

We’ll learn the truth next month when the Radeon RX 6800 XT and Radeon RX 6800 debut on November 18 (the Radeon RX 6900 XT isn’t coming until December 8), but we’re curious about how the disparity in ray-tracing performance might affect those of you who are in the market for a new GPU. Would you pay more strictly for improved ray tracing (and DLSS)? Let us know in the comments.

Join the Conversation

45 Comments

  1. First I see the reviews .. then I look at prices .. then I buy .. Ray Tracing performance is not a big deciding factor right now .. shoot, I might even skip this generation as I only really play one game and I only play at 1440p .. for that, my blower style 5700xt is working great.

  2. It is too early to tell. This could be due to an immature driver and/or firmware.

    Be nice if AMD had something to say on the topic. I don’t know that RT will be a big issue for me. Not much of what I play uses it.

  3. I agree that it’s too early to tell. Optimizations, learning the ins and outs of the architecture, drivers and such are all going to play a part in performance over time. Raytracing in games is still in its infancy for both AMD and nVidia, which is a major factor.

    That said, this is AMD’s first generation of hardware specifically for this and that could be a disadvantage, but not necessarily. Being first does not mean the competition is better even with second generation hardware. At minimum it will have an effect right now because the only hardware out there has been nVidia’s and efforts have been focused on that. This still doesn’t mean AMD will manage to be equivalent to nVidia even after everything has been worked out.

    The fact that AMD is also in consoles plays a part. Again, this could be good or bad. First generation hardware in consoles is not necessarily going to be good. There are a lot of tradeoffs in console hardware and this could come back to bite AMD. On the upside, AMD being in the consoles will likely mean more in-depth optimization and work towards getting the most out of console hardware, and some of that should bleed over into the PC space.

    This all said, it’s not going to surprise me if AMD falls short of nVidia on RT this round. I actually expect this to be the case. The upcoming Radeons look damn good but the performance/feature deficit AMD has been running in the video card space for a while isn’t something you can easily overcome in every facet in one leap.

    Besides that, I don’t really know how much effect it will really have. RT is still in its infancy and raster is still king. The number of games with any RT features is paltry even after more than two years of promised RT games. I’m of the opinion it’s going to be at least one more new hardware generation before RT sees any real gains as a required feature.

    1. “The number of games with any RT features is paltry even after more than two years of promised RT games.”

      Except AC: Valhalla, every major AAA game that releases at the end of this year will enable ray-tracing.

      Far Cry 6, Dirt 5, CP 2077, GodFall, WoW, Watch Dogs: Legion, Spiderman: MM, COD: BO, etc.

      So, RT is moving forward.

  4. [QUOTE=”GunShot, post: 22177, member: 1790″]
    “The number of games with any RT features is paltry even after more than two years of promised RT games.”

    Except AC: Valhalla, every major AAA game that releases at the end of this year will enable ray-tracing.

    Far Cry 6, Dirt 5, CP 2077, GodFall, WoW, Watch Dogs: Legion, Spiderman: MM, COD: BO, etc.

    So, RT is moving forward.
    [/QUOTE]
    Dude, go back to NVIDIA and stop posting this stupid crap, we all know you’re just an nvidia shill. A snail moves forward too, but nobody calls it a cheetah. Yes, RT is moving forward, it’s been doing that since before I first started working with it in the ’90s… if it was moving backwards I’d be worried. Seriously, you’re either completely obsessed with nvidia or someone is paying you to post your nonsense here. You’ve only got 6 posts and every single one is nonsense about AMD being crap and NVIDIA being the best thing since sliced bread. Seems this release has NVIDIA worried enough to start getting all of their paid posters back into the action, I guess that’s a good sign for AMD :).

  5. Wow! You are disrespecting me because the numbers of RT games disagree with this misinformation?

    You gotta be a lot of fun at your cheese-parties, huh?!

  6. Let’s be honest here. These games that have RT are still 95% raster with a sprinkling of shiny RT added. There isn’t enough power in any GPU on the market to implement any higher levels of RT. Which is why we’re years and generations away from meaningful RT implementations.

  7. [QUOTE=”Riccochet, post: 22188, member: 4″]
    Let’s be honest here. These games that have RT are still 95% raster with a sprinkling of shiny RT added. There isn’t enough power in any GPU on the market to implement any higher levels of RT. Which is why we’re years and generations away from meaningful RT implementations.
    [/QUOTE]
    ‘Meaningful’ RT implementations are subjective, which I point out only because we don’t have a good consensus on what actually would be meaningful. That’s an entire series of discussions unto itself :).

    But with respect to mixed workloads between raster and RT, that’s more or less how it has to be. Raster is just plain orders of magnitude more efficient at drawing pixels themselves, so it makes sense to gradually shift work to RT hardware as it becomes available.

    We’ll also not likely see rasterization disappear completely for similar reasons. Even cinematic rendering suites use a hybrid approach, as rasterization even in a limited implementation backed up by healthy RT input can significantly reduce rendering times while producing the same final output.

  8. Big giant headline based on what?

    Some guy on reddit?

    Yeah, I’m not buying anything because of what some random guy says.

    It does stand to reason, by common sense, that AMD is behind in this one aspect.

    But the simple good news is they are finally, once again, stepping up to the top tier. Good, bad, or indifferent, that’s good for everything. RT is still early and only a small part of a game to date…..it obviously has great promise…..for that I am anxious.

    But for now, congrats AMD…..but let’s see the reviews.

  9. [QUOTE=”GunShot, post: 22177, member: 1790″]
    “The number of games with any RT features is paltry even after more than two years of promised RT games.”

    Except AC: Valhalla, every major AAA game that releases at the end of this year will enable ray-tracing.

    Far Cry 6, Dirt 5, CP 2077, GodFall, WoW, Watch Dogs: Legion, Spiderman: MM, COD: BO, etc.

    So, RT is moving forward.
    [/QUOTE]
    I don’t have the list and don’t particularly care, but would you care to look up the list of games nVidia promised RT in with the release of the 2xxx series and see how many of them actually have any RT features, much less usable or noticeable ones? I saw it posted recently and it didn’t look pretty. It was over two years ago that these games were promised RT features, and that list isn’t complete.

    As for future games, I’ll believe it when they deliver it. Even better, I’ll believe it when they deliver it and turning on something other than the most basic settings doesn’t kill performance. New hardware power is going to increase game resource usage for normal performance increases and IQ. There are still plenty of games out now which struggle with higher IQ settings and some aren’t even playable with the highest. The new cards are playing catchup on many of these.

    This doesn’t even take into account that RT performance metrics are almost always based off the most powerful halo card, which very few people own. What happens once you start moving down the product stack? We already saw with nVidia’s 2xxx series that basically anything under a 2080Ti was marginal at best with RT and effectively useless on 2060s and 2070s. RT adoption is going to require a top to bottom hardware product stack which can make at least minimal use of the features. Neither company is anywhere near that point yet. It’s going to be a minimum of one more architecture advancement before that has a chance of happening and likely at least two. Maybe by the time the consoles do their mid-life refresh with new AMD hardware we might finally see something.

    1. But, RT now has its big DADDY to lead the next frontier for it.

      The next-gen consoles.

      Like it or not, the consoles are the foundation for today’s and tomorrow’s games, and Sony and MS’s future games are 150% in support for RT games.

      It is what it is.

  10. Benchmarks leaked earlier by VideoCardz showed ray tracing on the 6800XT to be about on par with a 2080 Ti. Seems AMD are doing the bare minimum at the moment. I’ll see what real reviews show, but I’m leaning back toward NVIDIA.
    [QUOTE=”Elf_Boy, post: 22167, member: 438″]
    It is too early to tell. This could be due to an immature driver and/or firmware.

    Be nice if AMD had something to say on the topic. I don’t know that RT will be a big issue for me. Not much of what I play uses it.
    [/QUOTE]
    I can’t tell you how many times I’ve heard “Just wait for the drivers” over the years of AMD releases. The HD 2900 XT release was a really fun time.

  11. I’d like to know at what level these consoles are implementing ray tracing. Considering they are both using AMD hardware.

    PS5: 36 CU’s
    XB: 52 CU’s
    6800XT: 72 CU’s
    6900XT: 80 CU’s

    If these consoles are what ray tracing will be based on then I can’t imagine either will perform well at all, or the ray tracing will be so barely implemented that it won’t be that noticeable.

  12. [QUOTE=”Riccochet, post: 22204, member: 4″]
    If these consoles are what ray tracing will be based on then I can’t imagine either will perform well at all, or the ray tracing will be so barely implemented that it won’t be that noticeable.
    [/QUOTE]

    I’m not really expecting a big push for raytracing this generation (except for a few standout attempts by AAA studios). There was an interview with Spencer downplaying the benefit of raytracing (believe he prioritized FPS over RT, iirc).

  13. Couldn’t care any less about ray-tracing technology. Really reminds me of the “PhysX” technology from years ago. Sure it made the games that supported it seem more realistic, but it was never a make or break decision for me. The only way I used it was by having my old GPU handle the PhysX processing after I upgraded to a more powerful GPU.

    Let me know how the cards handle normal graphics maxed out at 1080P and 4K.

    1. I remember a time when gamers used to whine that 720p@60 (and a stable 30 FPS) was more than enough for games. Screw 1080+!

      Now, 1440p@120 is the sweet spot and 40+ FPS is not even playable any more.

      Interesting, huh?

  14. [QUOTE=”LeRoy_Blanchard, post: 22217, member: 137″]
    Couldn’t care any less about ray-tracing technology. Really reminds me of the “PhysX” technology from years ago. Sure it made the games that supported it seem more realistic, but it was never a make or break decision for me. The only way I used it was by having my old GPU handle the PhysX processing after I upgraded to a more powerful GPU.

    Let me know how the cards handle normal graphics maxed out at 1080P and 4K.
    [/QUOTE]
    Same way I feel about it

  15. [QUOTE=”LeRoy_Blanchard, post: 22217, member: 137″]
    Couldn’t care any less about ray-tracing technology. Really reminds me of the “PhysX” technology from years ago. Sure it made the games that supported it seem more realistic, but it was never a make or break decision for me. The only way I used it was by having my old GPU handle the PhysX processing after I upgraded to a more powerful GPU.

    Let me know how the cards handle normal graphics maxed out at 1080P and 4K.
    [/QUOTE]
    Except PhysX is now used nearly everywhere. You just don’t explicitly see it anymore because it is turned on implicitly. More games today use PhysX for their physics simulation than any other middleware out there. Ray tracing will eventually evolve the same way once we get past the initial hump of it being a marketing point.

  16. [QUOTE=”Armenius, post: 22223, member: 180″]
    Except PhysX is now used nearly everywhere. You just don’t explicitly see it anymore because it is turned on implicitly. More games today use PhysX for their physics simulation than any other middleware out there. Ray tracing will eventually evolve the same way once we get past the initial hump of it being a marketing point.
    [/QUOTE]
    It’s also a great example of not worrying about a feature Gen 1. As I remember it, by the time that PhysX was actually used and something you might say “hey, look at this, it’s cool” in many games, the standalone PhysX cards were not fast enough to actually do the job. You were much better off not buying a PhysX card and just waiting for it to actually be in games first.

    I think this is the same trend for RT. By the time we actually see games really using RT as a serious feature, neither the 20X0 nor the 30X0 will be fast enough to actually run it. For this Gen, and maybe even next gen, go for the better raster card and take whatever RT stuff comes with it as a bonus, but not a critical feature.

  17. [QUOTE=”Armenius, post: 22223, member: 180″]
    Except PhysX is now used nearly everywhere. You just don’t explicitly see it anymore because it is turned on implicitly. More games today use PhysX for their physics simulation than any other middleware out there. Ray tracing will eventually evolve the same way once we get past the initial hump of it being a marketing point.
    [/QUOTE]
    This is true, but almost nothing uses the proprietary GPU-accelerated PhysX anymore. So why buy into the proprietary GPU RTX instead of just waiting for the ubiquitous multiplatform implementation to proliferate?

  18. Not a surprise.

    All the titles that use raytracing today were designed with the Nvidia implementation in mind. More general purpose applications will likely improve things, but no guarantee.

    Either way, I don’t think it is a big deal. I think we are several generations away from Ray Tracing being more than a minor game option you can switch on and off that is barely noticeable. My priority is still raster performance this gen. Maybe in a few years I’ll be more concerned with RT.

  19. [QUOTE=”GunShot, post: 22222, member: 1790″]
    I remember a time when gamers used to whine that 720p@60 (and a stable 30 FPS) was more than enough for games. Screw 1080+!

    Now, 1440p@120 is the sweet spot and 40+ FPS is not even playable any more.

    Interesting, huh?
    [/QUOTE]

    I don’t know about you, but I’ve been at this for a while. I always remember 60+ being the minimum target.

    Maybe not during the Voodoo 1 era, but certainly since the GeForce 256 launched in 1999.

    I remember in college, from 1999 to 2003, when I was a HUGE Counter-Strike addict, 60fps was universally considered the bare minimum to play. I was one of the lucky ones. The combination of my GeForce 2 GTS and later GeForce 3 TI500, my 22″ Iiyama Visionmaster pro and my highly overclocked Socket A Athlons allowed me to vsync at 100hz and pretty much get a stable 100fps at 1600×1200, unheard of at the time.

  20. I won’t care about RT for another year or 2. I’ll take a 6800XT with a full coverage water block.

  21. [QUOTE=”Armenius, post: 22223, member: 180″]
    Except PhysX is now used nearly everywhere. You just don’t explicitly see it anymore because it is turned on implicitly. More games today use PhysX for their physics simulation than any other middleware out there. Ray tracing will eventually evolve the same way once we get past the initial hump of it being a marketing point.
    [/QUOTE]

    It’s not the same thing as the standalone “PhysX” cards, nor is it the same as when GPUs were first able to start doing the job on their own (as a second card). As you said, everything has PhysX now, as will everything eventually have ray tracing, without the need to buy specific hardware for it.

    A “first on the block” tax is all this is. Therefore, I still couldn’t care any less about it.

  22. [QUOTE=”LeRoy_Blanchard, post: 22256, member: 137″]
    It’s not the same thing as the standalone “PhysX” cards, nor is it the same as when GPUs were first able to start doing the job on their own (as a second card). As you said, everything has PhysX now, as will everything eventually have ray tracing, without the need to buy specific hardware for it.

    A “first on the block” tax is all this is. Therefore, I still couldn’t care any less about it.
    [/QUOTE]

    Yeah, It’s Compute based physics now, no longer “PhysX” right? Accomplishes the same goal, but PhysX was acquired by Nvidia and kept proprietary.

    There was some problem with how the code of “PhysX” was written using x87 instructions resulting in it not being easily and efficiently ported. Can’t remember the details.

    And then – of course – Nvidia being Nvidia, they were assholes and blocked dedicated Nvidia PhysX GPU’s from working if the primary GPU was AMD…

  23. [QUOTE=”Zarathustra, post: 22258, member: 203″]
    Yeah, It’s Compute based physics now, no longer “PhysX” right? Accomplishes the same goal, but PhysX was acquired by Nvidia and kept proprietary.

    There was some problem with how the code of “PhysX” was written using x87 instructions resulting in it not being easily and efficiently ported. Can’t remember the details.

    And then – of course – Nvidia being Nvidia, they were *******s and blocked dedicated Nvidia PhysX GPU’s from working if the primary GPU was AMD…
    [/QUOTE]

    I don’t know what the technology is now, but I am pretty sure it’s available on both AMD and Nvidia cards. While I think you can still use “PhysX” on older titles that had it with Nvidia cards I am pretty sure newer titles use the new standard whatever it is. Someone with more GPU knowledge than me would probably know more about it. I don’t keep up with all that.

  24. [QUOTE=”Zarathustra, post: 22258, member: 203″]
    Yeah, It’s Compute based physics now, no longer “PhysX” right? Accomplishes the same goal, but PhysX was acquired by Nvidia and kept proprietary.

    There was some problem with how the code of “PhysX” was written using x87 instructions resulting in it not being easily and efficiently ported. Can’t remember the details.

    And then – of course – Nvidia being Nvidia, they were *******s and blocked dedicated Nvidia PhysX GPU’s from working if the primary GPU was AMD…
    [/QUOTE]
    No, I mean PhysX. It is a middleware solution for physics that is primarily CPU-based these days. Even console games are using it. It is integrated into Unreal Engine and Unity, notably, and runs on every system (PC, Switch, PlayStation 4, Xbox One). “Hardware-accelerated” PhysX that runs on a GPU is rare these days, but still used occasionally.

  25. [QUOTE=”Armenius, post: 22274, member: 180″]
    No, I mean PhysX. It is a middleware solution for physics that is primarily CPU-based these days. Even console games are using it. It is integrated into Unreal Engine and Unity, notably, and runs on every system (PC, Switch, PlayStation 4, Xbox One). “Hardware-accelerated” PhysX that runs on a GPU is rare these days, but still used occasionally.
    [/QUOTE]

    Interesting. I wonder how they finally got around the legacy x87 problem.

  26. [QUOTE=”Riccochet, post: 22204, member: 4″]
    I’d like to know at what level these consoles are implementing ray tracing. Considering they are both using AMD hardware.

    PS5: 36 CU’s
    XB: 52 CU’s
    6800XT: 72 CU’s
    6900XT: 80 CU’s

    If these consoles are what ray tracing will be based on then I can’t imagine either will perform well at all, or the ray tracing will be so barely implemented that it won’t be that noticeable.
    [/QUOTE]

    My guess is it’ll be used in very small quantities. My hope is that it’ll be used for lighting (ambient occlusion of sorts) and not used for every single shiny puddle in the street since it brings nothing to the game. You don’t need a ton of rays (comparatively) for ambient occlusion and it gives the most return for the value (in my opinion). This should allow some sort of quality settings (more/less rays for measuring light at specific points) which can scale with the GPU. For reflections, the only limit is the # of times the ray can bounce, but you can’t, for example, only do a reflection once every 10 pixels and expect it to look right. It’s mostly an all or nothing approach that you can only scale by turning it on/off for specific objects. Screen space reflections work pretty well, and ray-traced reflections don’t add that much realism to the game like proper lighting can.

  27. [QUOTE=”GunShot, post: 22222, member: 1790″]
    I remember a time when gamers used to whine that 720p@60 (and a stable 30 FPS) was more than enough for games. Screw 1080+!

    Now, 1440p@120 is the sweet spot and 40+ FPS is not even playable any more.

    Interesting, huh?
    [/QUOTE]

    3DFX set the standard for the 60 FPS “sweet spot” back in the 90s. It hasn’t changed much since then other than going higher. The standard was set to match the display’s own refresh rate, which on most displays back then was 60hz, and today most are still 60hz.

  28. [QUOTE=”GunShot, post: 22222, member: 1790″]
    I remember a time when gamers used to whine that 720p@60 (and a stable 30 FPS) was more than enough for games. Screw 1080+!

    Now, 1440p@120 is the sweet spot and 40+ FPS is not even playable any more.

    Interesting, huh?
    [/QUOTE]
    I remember playing and programming 320×200 and 320×240 (mode 13h unchained, aka x-mode or mode x depending on who you talked to). We’ve come a long way, but pixel density is getting to the point that increases aren’t making as large of a visual difference as they used to. Going from 320×200 to 640×480 was a large difference to my eyes. Going up to 1080p also increased the perceived detail a lot. 1080p to 1440p is a smaller increase but still easily noticeable by most. 4k depends on the screen size on whether it’s noticeable for me. Smaller screens I can’t really tell the difference besides my icons are smaller if I don’t set scaling. 8k seems like a solution looking for a problem at this point. I find things like better contrast, brightness and HDR bring a lot more to a screen than pushing more pixels. I would take a 1440p monitor with HDR over a 4k monitor without as the visual fidelity is much better. Of course, if I can have both, great, but my point was simply pushing more pixels doesn’t always increase the visuals as much as other methods. As you said, it’s a moving target though, 60fps used to be the gold standard, now 120, 144, etc are becoming the norm. But just as pushing higher resolutions is having limited returns, so does pushing higher frames… while most people would notice the difference between 30hz and 60hz… not as many would notice 60hz to 120hz.. and even fewer 144hz to 200+hz. I’m not saying nobody would notice, just the # of people who would benefit decreases significantly. Those are also typically the players who are turning off most of the visuals to get higher frame rates, so visual fidelity isn’t their top priority, fast response times are.

  29. [QUOTE=”Ready4Droid, post: 22285, member: 245″]
    4k depends on the screen size on whether it’s noticeable for me. Smaller screens I can’t really tell the difference besides my icons are smaller if I don’t set scaling.
    [/QUOTE]

    4k for me is all about immersive screen size, not an increase in visual fidelity. I sit at normal desktop computer distance (~2 to 2.5 ft) away from my 43″ 4k screen, and I love it.

  30. 32″ is the sweet spot for me with 4K. Nice and sharp. Also great size for actual work without snapping your neck at desk distances.

    Anyhow, looking forward to how AMD tackles RT, maybe even turn some of the Raster Elite Purists on to some fun. 🙂

  31. [QUOTE=”Auer, post: 22298, member: 225″]
    32″ is the sweet spot for me with 4K. Nice and sharp. Also great size for actual work without snapping your neck at desk distances.

    Anyhow, looking forward to how AMD tackles RT, maybe even turn some of the Raster Elite Purists on to some fun. 🙂
    [/QUOTE]
    I just can’t go back from my 50″ 4k TV; having 4 1080p RDP screens is a blessing for my job.

  32. There are rumors of AMD’s RT implementation being inferior/lower-IQ/slower than nvidia’s. I don’t really expect them to do better than the RTX2080Ti on their first try, but maybe lightning can strike twice.

  33. [QUOTE=”GunShot, post: 22222, member: 1790″]
    I remember a time when gamers used to whine that 720p@60 (and a stable 30 FPS) was more than enough for games. Screw 1080+!

    Now, 1440p@120 is the sweet spot and 40+ FPS is not even playable any more.

    Interesting, huh?
    [/QUOTE]
    And now console gamers will be able to play at 60fps and even 120fps at 1080p

  34. [QUOTE=”Stoly, post: 22303, member: 1474″]
    There are rumors of AMD’s RT implementation being inferior/lower-IQ/slower than nvidia’s. I don’t really expect them to do better than the RTX2080Ti on their first try, but maybe lightning can strike twice.
    [/QUOTE]
    The leaks show it outperforming the 2080ti, but falling short of the 3080… If indicative of actual performance it’s about in line with my expectations… OK for a first try, but not as good as Nvidia’s second try. Will have to see how it matures a bit and real games get tested. I don’t see how it would lower IQ for a given setting… If you’re casting out the same rays using the same API, you should be getting the same results. If you meant you have to lower quality settings to get the same speeds, this would make sense, but it’s more of an either/or, not both.
    I’m mostly excited to see how it does in games that use it for lighting as that’s when RT makes the biggest difference to me.

  35. [QUOTE=”Ready4Droid, post: 22365, member: 245″]
    The leaks show it outperforming the 2080ti, but falling short of the 3080… If indicative of actual performance it’s about in line with my expectations… OK for a first try, but not as good as Nvidia’s second try. Will have to see how it matures a bit and real games get tested. I don’t see how it would lower IQ for a given setting… If you’re casting out the same rays using the same API, you should be getting the same results. If you meant you have to lower quality settings to get the same speeds, this would make sense, but it’s more of an either/or, not both.
    I’m mostly excited to see how it does in games that use it for lighting as that’s when RT makes the biggest difference to me.
    [/QUOTE]

    IMO nvidia’s 2nd try is not much better, if at all, compared with its 1st try, seeing that games have pretty much the same performance hit when enabling RT. Same goes for DLSS, which gives about the same performance gains.

    I guess rumors on AMD RT performance/IQ are based on the crappy RT demo AMD showed a few months ago. But based on what they showed on the presentation, I’d say it looks about the same as RTX.

  36. [QUOTE=”Stoly, post: 22384, member: 1474″]
    IMO nvidia’s 2nd try is not much better, if at all, compared with its 1st try, seeing that games have pretty much the same performance hit when enabling RT. Same goes for DLSS, which gives about the same performance gains.

    I guess rumors on AMD RT performance/IQ are based on the crappy RT demo AMD showed a few months ago. But based on what they showed on the presentation, I’d say it looks about the same as RTX.
    [/QUOTE]
    The performance penalty is always going to be similar. NVIDIA have doubled the ray tracing performance over their first try.

  37. [QUOTE=”Armenius, post: 22385, member: 180″]
    The performance penalty is always going to be similar. NVIDIA have doubled the ray tracing performance over their first try.
    [/QUOTE]
    My math doesn’t compute. If RTX performance has improved by 2X, shouldn’t the performance hit be much lower? For example, in Control the RTX2080Ti has about a 40% performance drop enabling RT effects, which is about the same as the RTX 3070 or 3080.

  38. [QUOTE=”Stoly, post: 22387, member: 1474″]
    My math doesn’t compute. If RTX performance has improved by 2X, shouldn’t the performance hit be much lower? For example, in Control the RTX2080Ti has about a 40% performance drop enabling RT effects, which is about the same as the RTX 3070 or 3080.
    [/QUOTE]
    You are correct, the only way they should remain at 40% is if the 3080 was 2x the speed of the 2080ti, which it isn’t. Otherwise the % difference should be better, although really not by as much as you’re probably thinking.

    Example:
    Game runs at 80fps without RT and -40% (48fps) with RT
    This gives you a frame time of 12.5ms (1000ms/80fps) without RT and 20.833ms (1000ms/48fps) with RT. This equals 8.33ms of time that RT takes.

    If you have a 3080 that’s 40% faster, we’ll say it runs at 112fps without RT. If RT was twice as fast, it should only take ~4.17ms (8.33ms/2) for RT..
    So, frametime without RT would be 8.93ms (1000ms/112fps)… add in RT time and we get 8.93 + 4.17 = 13.1ms which would end up at (1000/13.1) 76fps… This is about 32% slower… even though RT time was cut in half. So while it shouldn’t be 40% slower, it isn’t as big of a change as you’re thinking (double RT performance doesn’t mean 20% performance hit). Since RT was a small slice of the actual frame time, doubling it doesn’t make as much of a difference as would doubling raster performance.

    These are similar #’s (maybe a little high actually; the longer raster takes, the less difference RT makes) to what you might see, so you can see that even doubling RT performance still leaves us with a > 30% hit to performance in this made-up example.

    Anyways, hope that wasn’t too confusing 🙂 Made sense when I started writing it, lol.
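
    The same arithmetic, spelled out in a few lines of Python (same made-up numbers as the example above, so treat it as an illustration of the reasoning rather than a benchmark):

    ```python
    # Frame-time view of the hypothetical example above (made-up numbers, not measurements).
    def ms(fps):
        return 1000.0 / fps            # frames per second -> milliseconds per frame

    raster_ms = ms(80)                 # 12.5 ms of raster work per frame
    rt_ms = ms(48) - raster_ms         # 20.83 - 12.5 = 8.33 ms spent on RT

    faster_raster_ms = ms(112)         # a card with 40% faster raster: 8.93 ms
    faster_rt_ms = rt_ms / 2           # assume its RT hardware is twice as fast: 4.17 ms

    fps_with_rt = 1000.0 / (faster_raster_ms + faster_rt_ms)   # ~76 fps
    hit = (1 - fps_with_rt / 112) * 100                        # ~32% performance hit
    print(f"{fps_with_rt:.0f} fps with RT, roughly a {hit:.0f}% hit")
    ```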

  39. [QUOTE=”Ready4Droid, post: 22407, member: 245″]
    You are correct, the only way they should remain at 40% is if the 3080 was 2x the speed of the 2080ti, which it isn’t. Otherwise the % difference should be better, although really not by as much as you’re probably thinking.

    Example:
    Game runs at 80fps without RT and -40% (48fps) with RT
    This gives you a frame time of 12.5ms (1000ms/80fps) without RT and 20.833ms (1000ms/48fps) with RT. This equals 8.33ms of time that RT takes.

    If you have a 3080 that’s 40% faster, we’ll say it runs at 112fps without RT. If RT was twice as fast, it should only take ~4.17ms (8.33ms/2) for RT..
    So, frametime without RT would be 8.93ms (1000ms/112fps)… add in RT time and we get 8.93 + 4.17 = 13.1ms which would end up at (1000/13.1) 76fps… This is about 32% slower… even though RT time was cut in half. So while it shouldn’t be 40% slower, it isn’t as big of a change as you’re thinking (double RT performance doesn’t mean 20% performance hit). Since RT was a small slice of the actual frame time, doubling it doesn’t make as much of a difference as would doubling raster performance.

    These are similar #’s (maybe a little high actually; the longer raster takes, the less difference RT makes) to what you might see, so you can see that even doubling RT performance still leaves us with a > 30% hit to performance in this made-up example.

    Anyways, hope that wasn’t too confusing 🙂 Made sense when I started writing it, lol.
    [/QUOTE]
    It was :unsure::unsure:, but no need to overthink it right now.
    On a side note, it seems RTX is quite a bit faster in Blender with Ampere vs Turing… Food for thought.

  40. As a consumer I want a card that meets a few criteria.

    1. 1440p at 75hz with game on maximum settings.
    2. Future improvements with updates and learning.
    3. No hardware limitations as games use more and more video memory.
    4. Actually have a product on store shelves to buy.

    If AMD can have stock on shelves to meet buyer demand, they will need to write a love letter to Nvidia for all of the market foreplay getting consumers wanting the newer hardware bad. Because come the review releases I’ll be looking to buy, and if AMD is on par.. I will buy them… provided they are on store shelves.
