Images: AMD

It’s too early to tell how AMD’s Radeon RX 6000 Series graphics cards will cope with ray tracing, but what we can say for certain is that they’ll include broad support for the demanding rendering technique.

AMD’s marketing department made that abundantly clear in a new statement today, confirming that the Radeon RX 6900 XT, Radeon RX 6800 XT, and Radeon RX 6800 will support most of today’s ray-traced titles, many of which leverage Microsoft’s DirectX Raytracing (DXR) API.

“AMD will support all ray tracing titles using industry-based standards, including the Microsoft DXR API and the upcoming Vulkan raytracing API,” the company told AdoredTV.

AMD also pointed out something that should surprise no one: its RDNA 2 GPUs will not include any level of official support for NVIDIA’s RTX extensions, which were custom-developed by NVIDIA for its own hardware.

“Games making use of proprietary raytracing APIs and extensions will not be supported,” the company noted.

This poses an interesting conundrum, particularly for those of you trying to decide between a GeForce and a Radeon card based on the level of ray-tracing support that AMD and NVIDIA can provide.

While the green team appears to be the immediate winner (it was the first to bring ray tracing to the masses, after all), there’s a slight question mark here: nobody seems to know how AMD’s Radeon RX 6000 Series will handle ray-traced games that rely on NVIDIA’s custom Vulkan extensions, such as Quake II RTX and Wolfenstein: Youngblood.

Because the standard Vulkan ray tracing extensions are not out yet, NVIDIA implemented ray tracing in those games using its own custom Vulkan extensions. AMD has the option of writing its own Vulkan extensions to bring ray tracing support to those titles; however, AMD’s recent statement suggests that the ray-tracing options in these games will simply be disabled on Radeon hardware.
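
For the technically inclined, the split is visible right at the API level. The sketch below is our own minimal C++ illustration (not code from any shipping title): it asks each Vulkan device whether it advertises NVIDIA’s vendor-specific VK_NV_ray_tracing extension, which Quake II RTX was built against, or the cross-vendor VK_KHR_ray_tracing_pipeline extension from the Khronos standard (named here per the finalized spec; the provisional spec used a slightly different name). A game that only looks for the NV extension will simply never find it on a Radeon card.

    // Minimal sketch: report whether each Vulkan device exposes NVIDIA's
    // vendor-specific ray tracing extension and/or the cross-vendor Khronos one.
    // Error handling is trimmed for brevity.
    #include <vulkan/vulkan.h>
    #include <cstdio>
    #include <cstring>
    #include <vector>

    static bool hasExt(const std::vector<VkExtensionProperties>& exts, const char* name) {
        for (const auto& e : exts)
            if (std::strcmp(e.extensionName, name) == 0) return true;
        return false;
    }

    int main() {
        VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
        app.apiVersion = VK_API_VERSION_1_1;
        VkInstanceCreateInfo info{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
        info.pApplicationInfo = &app;
        VkInstance instance;
        if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) return 1;

        uint32_t gpuCount = 0;
        vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
        std::vector<VkPhysicalDevice> gpus(gpuCount);
        vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

        for (VkPhysicalDevice gpu : gpus) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(gpu, &props);

            // List the device's extensions, then check for the two RT variants.
            uint32_t extCount = 0;
            vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
            std::vector<VkExtensionProperties> exts(extCount);
            vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

            std::printf("%s | NV ray tracing: %s | KHR ray tracing: %s\n",
                        props.deviceName,
                        hasExt(exts, "VK_NV_ray_tracing") ? "yes" : "no",
                        hasExt(exts, "VK_KHR_ray_tracing_pipeline") ? "yes" : "no");
        }

        vkDestroyInstance(instance, nullptr);
        return 0;
    }

In principle, then, patching such a game means adding a code path that queries and uses the cross-vendor extension, rather than a fundamental rewrite of its renderer.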

We’re wondering whether developers will actually go back and update their titles so that Radeon RX 6000 Series users can enjoy them at their intended fidelity. There’s also the slimmer possibility of a third-party workaround (e.g., a mod or wrapper) that might enable RTX effects on the red team’s cards.

If AMD’s ray-tracing support does manage to reach parity with NVIDIA, it’ll be interesting to see how the latter responds.

Editor’s Note: The above paragraphs were edited to better clarify the different ways ray tracing can be brought into games and the options available to enable it.

23 Comments

  1. NVIDIA is using DXR and Vulkan Ray Tracing just like AMD. “RTX” is just branding for their middleware that takes advantage of NVIDIA-specific hardware.

    1. In most cases yes, but for Cyberpunk and the other listed titles they must be using GameWorks, because they knew AMD would have an RT card around its launch. Too bad for them; I’m upgrading to Ryzen very soon, and I’m going to swap my 2080 Ti for a 6900 XT. That combo performance boost can’t be ignored and neither can the price difference.

      CDPR will likely update the game with support for AMD using DX12, because they are already adding it to the consoles; therefore, it must work on AMD hardware.

  2. [QUOTE=”Armenius, post: 22784, member: 180″]
    NVIDIA is using DXR and Vulkan Ray Tracing just like AMD. “RTX” is just branding for their middleware that takes advantage of NVIDIA-specific hardware.
    [/QUOTE]
    I always wondered about that.

  3. [QUOTE=”ThreeDee, post: 22771, member: 164″]
    AMD in it for the long haul … marathon, not a sprint
    [/QUOTE]
    Everybody is in it for the long haul.

  4. CDPR will definitely update the game to use regular DX12 RT functions. They have to make an AMD-compatible version for the consoles, so…yeah.

    Still a dick move by nVidia (and I own a 2080 Ti). They must be using some special GameWorks API they developed just for themselves to use, which means nVidia paid the developers to include it and the developers don’t have access to the code, so they can’t optimize it… Just like the tessellation issue in The Witcher 3.

    Guess I won’t be doing my first Cyberpunk run with ray tracing because (if I can get one) I’m buying a 6900 XT to take advantage of the performance boost when paired with 3rd-gen Ryzen.

  5. [QUOTE=”d0x360, post: 22946, member: 1604″]
    That combo performance boost can’t be ignored
    [/QUOTE]
    What combo performance boost? Even the stickers aren’t the same color. Instead you should get them color-coordinated, with a green GPU and a green CPU ;).
    [QUOTE=”d0x360, post: 22946, member: 1604″]
    CDPR will likely update the game with support for AMD using DX12, because they are already adding it to the consoles; therefore, it must work on AMD hardware.
    [/QUOTE]
    Hopefully it will be that straightforward for them, but since AMD’s equivalent of DLSS is still on the drawing board, said port is likely to underperform in either visuals or framerate, if not both.

  6. [QUOTE=”d0x360, post: 22947, member: 1604″]
    CDPR will definitely update the game to use regular DX12 RT functions. They have to make an AMD-compatible version for the consoles, so…yeah.

    Still a dick move by nVidia (and I own a 2080 Ti). They must be using some special GameWorks API they developed just for themselves to use, which means nVidia paid the developers to include it and the developers don’t have access to the code, so they can’t optimize it… Just like the tessellation issue in The Witcher 3.

    Guess I won’t be doing my first Cyberpunk run with ray tracing because (if I can get one) I’m buying a 6900 XT to take advantage of the performance boost when paired with 3rd-gen Ryzen.
    [/QUOTE]
    The game is already using “regular DX12 RT functions.” You misunderstand what middleware is.

  7. [QUOTE=”Armenius, post: 22989, member: 180″]
    The game is already using “regular DX12 RT functions.” You misunderstand what middleware is.
    [/QUOTE]

    Please explain it. Because honestly at this point when it comes to RT I have no effing clue.

  8. [QUOTE=”Grimlakin, post: 23000, member: 215″]
    Please explain it. Because honestly at this point when it comes to RT I have no effing clue.
    [/QUOTE]
    Being middleware means it is a set of optimized functions and code paths, built on top of standard APIs, that achieve a certain desired outcome. The code was developed by NVIDIA engineers to use DirectX to achieve optimal ray-traced effects on NVIDIA hardware. It’s why another piece of NVIDIA middleware, GameWorks, can run on AMD hardware but is slower in comparison. In other words, games that use RTX for ray tracing effects can already support ray tracing on other hardware because the underlying API is hardware agnostic.
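
    If a code sketch helps (illustrative only; assumes the standard D3D12 headers, error handling trimmed): DXR capability is just a vendor-neutral feature query against the device. Any driver that reports a ray tracing tier can run DXR titles, whatever logo is on the box.

    [CODE]
    // Rough sketch: DXR support is exposed as a D3D12 feature "tier," not as
    // anything NVIDIA-specific. GeForce and Radeon drivers answer the same query.
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    int main() {
        Microsoft::WRL::ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device))))
            return 1; // no D3D12-capable adapter found

        // If the query fails, opts stays zeroed and the tier reads as
        // "not supported," which is the safe default.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                    &opts, sizeof(opts));
        std::printf("DXR supported: %s\n",
                    opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0 ? "yes" : "no");
        return 0;
    }
    [/CODE]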

  9. [QUOTE=”Grimlakin, post: 22770, member: 215″]
    I am all for open standards and history has proven that those win out over time.
    [/QUOTE]
    yeah, like opengl, cl, xl, lol

    No, wait…

    I’m still waiting for Vulkan to take over the world (and it should…)

  10. [QUOTE=”Stoly, post: 23127, member: 1474″]
    yeah, like opengl, cl, xl, lol

    No, wait…

    I’m still waiting for Vulkan to take over the world (and it should…)
    [/QUOTE]
    Same. Unfortunately Vulkan Ray Tracing isn’t out, yet.

    Microsoft sabotaged OpenGL, which is why it didn’t gain traction in games despite often being superior to Direct3D in many ways. My professor for OpenGL at university was Richard Wright, who represented Real 3D on the ARB back in the day, and he always told his incoming classes the story about Microsoft vs. ARB and OpenGL.

  11. [QUOTE=”Armenius, post: 22784, member: 180″]
    NVIDIA is using DXR and Vulkan Ray Tracing just like AMD. “RTX” is just branding for their middleware that takes advantage of NVIDIA-specific hardware.
    [/QUOTE]
    I think Vulkan RT isn’t finished yet, which is why NVIDIA had to create its own extensions, as it did with OGL for other technologies. AMD could do its own or wait till Vulkan RT is finished.

  12. [QUOTE=”Armenius, post: 23141, member: 180″]
    Same. Unfortunately Vulkan Ray Tracing isn’t out, yet.

    Microsoft sabotaged OpenGL, which is why it didn’t gain traction in games despite often being superior to Direct3D in many ways. My professor for OpenGL at university was Richard Wright, who represented Real 3D on the ARB back in the day, and he always told his incoming classes the story about Microsoft vs. ARB and OpenGL.
    [/QUOTE]
    How did MS sabotage OGL? I’m genuinely asking.

    I think OGL was pretty good but failed to quickly adopt new technologies, hence the need for extensions. It was better and faster than DX, but again, a lack of support and features made it quickly fade from the PC space.

  13. [QUOTE=”Stoly, post: 23127, member: 1474″]
    I’m still waiting for Vulkan to take over the world (and it should…)
    [/QUOTE]
    At least in terms of having support for it, since it’s so close to DX12, and forms a basis for Android and desktop Linux support.
    [QUOTE=”Stoly, post: 23164, member: 1474″]
    I think OGL was pretty good but failed to quickly adopt new technologies, hence the need for extensions. It was better and faster than DX, but again, a lack of support and features made it quickly fade from the PC space.
    [/QUOTE]
    OpenGL was still hugely focused on commercial work, IIRC, to the point of neglecting the gaming / real-time graphics side of things. And I’m not sure it would have mattered. Once Microsoft went feet-first into gaming, there was little reason for vendors to try to maintain an alternative.
    [QUOTE=”Grimlakin, post: 22770, member: 215″]
    I am all for open standards and history has proven that those win out over time.
    [/QUOTE]
    The ‘open’ standards usually arise after proprietary implementations ‘show the way’. And they don’t always win out, and aren’t always better.

    Sometimes ‘open’ just means unfocused, with a variety of implementations that don’t necessarily work together and have many ‘poor’ examples, such as FreeSync, which is now being standardized by [I]Nvidia[/I], OpenGL which was never a good fit for gaming and has been abandoned (but was [I]also[/I] best supported by Nvidia, especially on Linux!), and speaking of Linux, every attempt at a Linux ‘desktop’ so far.

    Sometimes people just want something that will actually get the job done. Sometimes they actually, god forbid, want the best tool for the job!

    Many times it takes a leader to get that done. Not all the time; Linux, for example, has taken over the world, and I think it’s only a matter of time before Microsoft ports their desktop to the Linux kernel. But it would take someone like Microsoft to actually do that and force some standardization in the stack before it will really be useful across the broad range of end-user applications.

  14. [QUOTE=”Stoly, post: 23164, member: 1474″]
    How did MS sabotage OGL? I’m genuinely asking.

    I think OGL was pretty good but failed to quickly adopt new technologies, hence the need for extensions. It was better and faster than DX, but again, a lack of support and features made it quickly fade from the PC space.
    [/QUOTE]
    Microsoft threatened to not support OpenGL at all in their operating systems if ARB started advertising and marketing it to game companies because they wanted their own API to become the de facto standard in 3D accelerated real-time graphics. It’s why OpenGL was always associated with CAD and other similar productivity software during that time.

    The only time OpenGL really lagged was when the pixel shader pipeline was developed. Microsoft was the first to market with a viable model in DirectX 8, while OpenGL only had fundamental hardware-specific extensions for pixel shaders until the release of GLSL in 2004. Even then, most implementations were using NVIDIA-specific extensions, which is why games like Doom 3 ran horribly on ATi hardware that fell back to the generic fixed functions. By the time OpenGL 4.0 was released in 2010 it had achieved near-feature parity with Direct3D.
    [QUOTE=”LazyGamer, post: 23176, member: 1367″]
    At least in terms of having support for it, since it’s so close to DX12, and forms a basis for Android and desktop Linux support.

    OpenGL was still hugely focused on commercial work, IIRC, to the point of neglecting the gaming / real-time graphics side of things. And I’m not sure it would have mattered. Once Microsoft went feet-first into gaming, there was little reason for vendors to try to maintain an alternative.

    The ‘open’ standards usually arise after proprietary implementations ‘show the way’. And they don’t always win out, and aren’t always better.

    Sometimes ‘open’ just means unfocused, with a variety of implementations that don’t necessarily work together and have many ‘poor’ examples, such as FreeSync, which is now being standardized by [I]Nvidia[/I], OpenGL which was never a good fit for gaming and has been abandoned (but was [I]also[/I] best supported by Nvidia, especially on Linux!), and speaking of Linux, every attempt at a Linux ‘desktop’ so far.

    Sometimes people just want something that will actually get the job done. Sometimes they actually, god forbid, want the best tool for the job!

    Many times it takes a leader to get that done. Not all the time; Linux, for example, has taken over the world, and I think it’s only a matter of time before Microsoft ports their desktop to the Linux kernel. But it would take someone like Microsoft to actually do that and force some standardization in the stack before it will really be useful across the broad range of end-user applications.
    [/QUOTE]
    The perception of OpenGL not being good for gaming was created by Microsoft. As I say above, it really wasn’t until pixel shaders were developed that it lagged behind Direct3D. I think one of the best examples of the contrast between a game using OpenGL and Direct3D was the original Half-Life release. The difference between how that game looked and ran between using DirectX 7 and OpenGL was stark in the latter’s favor.

  15. [QUOTE=”Armenius, post: 23193, member: 180″]
    Microsoft threatened to not support OpenGL at all in their operating systems if ARB started advertising and marketing it to game companies because they wanted their own API to become the de facto standard in 3D accelerated real-time graphics. It’s why OpenGL was always associated with CAD and other similar productivity software during that time.

    The only time OpenGL really lagged was when the pixel shader pipeline was developed. Microsoft was the first to market with a viable model in DirectX 8, while OpenGL only had fundamental hardware-specific extensions for pixel shaders until the release of GLSL in 2004. Even then, most implementations were using NVIDIA-specific extensions, which is why games like Doom 3 ran horribly on ATi hardware that fell back to the generic fixed functions. By the time OpenGL 4.0 was released in 2010 it had achieved near-feature parity with Direct3D.

    The perception of OpenGL not being good for gaming was created by Microsoft. As I say above, it really wasn’t until pixel shaders were developed that it lagged behind Direct3D. I think one of the best examples of the contrast between a game using OpenGL and Direct3D was the original Half-Life release. The difference between how that game looked and ran between using DirectX 7 and OpenGL was stark in the latter’s favor.
    [/QUOTE]

    Hadn’t heard about that one. I do vaguely recall that MS was going to drop OpenGL support in Vista or Win7.

    Thing is, for years OpenGL lagged even on Linux. AMD’s terrible drivers didn’t help either. Even Android support was lagging; there’s a reason NVIDIA remained the performance leader for so long (using extensions, nonetheless…)

    Back in the Quake days, OpenGL was the renderer of choice. id, Valve, and Epic were big supporters. I would always choose OGL with the Quake, Unreal, and Half-Life series. But a few years later only id remained using it (probably when shaders came out)

  16. [QUOTE=”Stoly, post: 23204, member: 1474″]
    Hadn’t heard about that one. I do vaguely recall that MS was going to drop OpenGL support in Vista or Win7.

    Thing is, for years OpenGL lagged even on Linux. AMD’s terrible drivers didn’t help either. Even Android support was lagging; there’s a reason NVIDIA remained the performance leader for so long (using extensions, nonetheless…)

    Back in the Quake days, OpenGL was the renderer of choice. id, Valve, and Epic were big supporters. I would always choose OGL with the Quake, Unreal, and Half-Life series. But a few years later only id remained using it (probably when shaders came out)
    [/QUOTE]
    Good info in this thread. It puts into more detail what I am trying to relate from what was shared with us at university. There was a lot of animosity apparent when Richard Wright talked about it, but there is truth in it. Microsoft left the ARB in 2003 because they were no longer interested in collaborating with the board. They would take their own initiative to further develop the DirectX API by working with the industry on their own terms.

    [URL]https://www.overclockers.co.uk/forums/threads/a-brief-history-of-opengl.18573678/[/URL]

  17. [QUOTE=”Armenius, post: 23207, member: 180″]
    Good info in this thread. It puts into more detail what I am trying to relate from what was shared with us at university. There was a lot of animosity apparent when Richard Wright talked about it, but there is truth in it. Microsoft left the ARB in 2003 because they were no longer interested in collaborating with the board. They would take their own initiative to further develop the DirectX API by working with the industry on their own terms.

    [URL]https://www.overclockers.co.uk/forums/threads/a-brief-history-of-opengl.18573678/[/URL]
    [/QUOTE]
    Good read. I recall a few things being different, but it’s probably the Alzheimer’s :LOL: :p:rolleyes::rolleyes:

  18. The moment OpenGL lost Microsoft’s support, though, is when it started to die.
    We may even blame Microsoft, but realistically, OpenGL wasn’t going anywhere on its own. Vendors had to push it, developers had to push it, and they didn’t. Same with Vulkan now, and we see the same lack of support; we simultaneously blame Microsoft and Nvidia, while the most support comes from them!

    I think what we find is that gaming is its own ‘game’ when it comes to APIs, let alone hardware. Precision gets tossed for speed, accuracy for optimization. OpenGL was developed for the former, DirectX for the latter, in a minor oversimplification.

    Note that even Apple abandoned OpenGL (and Vulkan!). I also think that we’re at the point where the API simply doesn’t matter as much. We’ve seen decent progress on live DX12 to Vulkan translation, for example. With the low-overhead APIs there’s just not much room for differentiation, and with CPUs as fast as they are, the actual work that needs to be done is fairly minimal.

  19. [QUOTE=”LazyGamer, post: 23236, member: 1367″]
    The moment OpenGL lost Microsoft’s support, though, is when it started to die.
    We may even blame Microsoft, but realistically, OpenGL wasn’t going anywhere on its own. Vendors had to push it, developers had to push it, and they didn’t. Same with Vulkan now, and we see the same lack of support; we simultaneously blame Microsoft and Nvidia, while the most support comes from them!

    I think what we find is that gaming is its own ‘game’ when it comes to APIs, let alone hardware. Precision gets tossed for speed, accuracy for optimization. OpenGL was developed for the former, DirectX for the latter, in a minor oversimplification.

    Note that even Apple abandoned OpenGL (and Vulkan!). I also think that we’re at the point where the API simply doesn’t matter as much. We’ve seen decent progress on live DX12 to Vulkan translation, for example. With the low-overhead APIs there’s just not much room for differentiation, and with CPUs as fast as they are, the actual work that needs to be done is fairly minimal.
    [/QUOTE]

    Even after reading the article, it seems to me OGL has itself to blame for its demise: too many wrong choices and a lack of innovation. Once it got behind DX, it never really caught up.

    Vulkan may or may not suffer the same fate. It’s arguably better than DX12, but again it’s lagging behind with RT, and once again NVIDIA has to push it with its own extensions.

    Even mobile could remain stagnant, as after all these years there are really no games that actually push it, and mobile gaming may be better suited for streaming. Time will tell…

  20. [QUOTE=”Stoly, post: 23240, member: 1474″]
    Even after reading the article, it seems to me OGL has itself to blame for its demise: too many wrong choices and a lack of innovation. Once it got behind DX, it never really caught up.

    Vulkan may or may not suffer the same fate. It’s arguably better than DX12, but again it’s lagging behind with RT, and once again NVIDIA has to push it with its own extensions.

    Even mobile could remain stagnant, as after all these years there are really no games that actually push it, and mobile gaming may be better suited for streaming. Time will tell…
    [/QUOTE]
    Agreed. The name may have changed, but it seems like the same mistakes are being made. Most of the partners in Khronos seem to be sitting on their hands while NVIDIA is once again the only one pulling its weight in the group.
