Introduction

It’s here! It’s here! It finally came!  We now have Ray Tracing support in Cyberpunk 2077 from CD Projekt Red on AMD Radeon RX 6000 series video cards!  It took more than six months for that support to arrive, something that should have been ready at the game’s launch.  With the new patch 1.2, Ray Tracing support for the Radeon RX 6000 series has been added.  This means you can turn on Reflection, Shadow, and Lighting Ray Tracing in Cyberpunk 2077 on the Radeon RX 6900 XT, Radeon RX 6800/XT, and Radeon RX 6700 XT.

Cyberpunk 2077 Patch 1.2 Release Notes

This new patch 1.2 actually brings a ton of changes to the game.  The list of fixes is really quite extensive.  There are also updates to the game’s graphics in other areas: improvements to texture rendering from afar, visual quality adjustments of elements when underwater, improvements in material detail quality, improvements for interior and exterior light sources, adjusted dirt quality, and improved foliage destruction visuals, plus various optimizations to shadows, shaders, physics, the animation system, the occlusion system, and facial animations.  There’s really a whole lot more here.

To find out how demanding these features are, and what kind of performance you’ll experience, we decided to go test them.  This is a quick, no-frills, straight-to-the-point, down-and-dirty performance comparison.  We are taking the AMD Radeon RX 6800 XT and the NVIDIA GeForce RTX 3080 FE and putting them head-to-head in Ray Tracing in Cyberpunk 2077.

We are also going to find out what kind of performance drop you get by enabling these features on each video card.  We are going to test the built-in preset options, as well as manually enabling just Reflections, then Shadows, then Lighting to find out how each one performs individually.

Cyberpunk 2077 Ray Tracing Menu

Ok, let’s look at how you can enable Ray Tracing on the Radeon RX 6800 XT.  As with the GeForce RTX 3080 FE, you can select different “Quick Preset” options at the top.  If you set this to “Ultra,” you get the highest game settings, but no Ray Tracing.

There are two built-in presets for Ray Tracing: Ray Tracing Medium and Ray Tracing Ultra.  In Ray Tracing Medium, Ray-Traced Reflections are kept OFF, while Ray-Traced Shadows are ON and Ray-Traced Lighting is set to “Medium.”  When you select Ray Tracing Ultra, everything is enabled: you get Ray-Traced Reflections and Ray-Traced Shadows, and Ray-Traced Lighting is set to “Ultra.”  The Ray-Traced Lighting option actually has one higher setting called “Psycho,” but don’t dare turn this on; it kills performance on pretty much everything.

Since you can also manually go into each option and turn on just one Ray-Traced feature to save performance, we are also going to test each one individually.  In this way, we can see which ones place the greatest burden on GPU performance and how they compare.

Resizable BAR

Before we begin, just a couple of notes on the system setup.  For this review, we are using a Ryzen 9 5900X.  We have enabled Resizable BAR and AMD Smart Access Memory on each video card.  We have applied the latest motherboard BIOS, enabled Resizable BAR in it, and also applied the new VBIOS for the GeForce RTX 3080 FE video card.  This means Resizable BAR is enabled on both video cards here today, as you can see above.  We are also using the latest drivers for each video card, GeForce 465.89 and AMD Adrenalin 2020 Edition 21.3.2 Optional.

Brent Justice

Brent Justice has been reviewing computer components for 20+ years.  Educated in the art and method of the computer hardware review, he brings experience, knowledge, and hands-on testing with a gamer-oriented...

13 Comments

  1. So basically, AMD’s RT performance is a swing and a miss for its first attempt. I’m glad they are there but they need to take another stab at the way they go about it, cause the first go-around ain’t all that pretty with respect to performance.

    Thanks for the look into it Brent. Job well done!

  2. I’m still waiting to buy a GPU that has ray tracing… so it’s really a moot point.

    AMD needs help with this, it appears.

  3. I’m not too concerned.

    First, as cool as Raytracing is – I don’t play a single game that supports it yet, and even of the ones I am looking at getting, yeah… it would be single digit numbers for the near term. So it’s nice, but it’s far and away not my primary concern. Same thing with DLSS honestly – nice feature, wouldn’t turn it down – not paying extra. Traditional rasterizing performance is still my primary concern.

    For the same amount of money – yeah, I’d rather have better RT than worse RT. But I’m not going to go set out to spend more money just for the sake of better RT. The offerings, at MSRP, between nVidia and AMD – break out a bit oddly for me. The 6700 doesn’t make a lot of sense at all, the 6800 is a rock star, the 6900 not really, the 6800 XT on the fence…. but that’s at MSRP, all other things equal.

    Second – yeah, can’t buy any of these anyway, from any manufacturer. So … like @magoo says, moot point. Not to say this isn’t a good writeup. It’s great info to have. It just won’t influence any of my near purchases because… there’s nothing out there to purchase.

    Third – playing devil’s advocate: it isn’t charted, but I imagine it clocks in about the same as nVidia’s first stab at hardware accelerated RT. Granted, that only goes so far – you can’t really compare two different gens, otherwise we’d be talking about how much better the 11900 is than Bulldozer – just that I didn’t have very high expectations and those expectations were not exceeded by AMD.

    If you really truly care about Raytracing, I guess your choice is clear cut. For the same price and availability, I’d rather have it than not, I fully well admit. But, for instance, I wouldn’t pay more for 20% better raytracing performance but 20% worse rasterization performance – which is about how the line breaks down now, depending on what tier you’re looking at.

  4. So basically, AMD’s RT performance is a swing and a miss for its first attempt. I’m glad they are there but they need to take another stab at the way they go about it, cause the first go-around ain’t all that pretty with respect to performance.

    I’m going to swing toward giving AMD credit for having gotten the hardware out there in the first place. Same credit I give Nvidia for their 2000-series, the release of which has made possible games with decent RT implementations today. Same credit I’ll give Intel and ARM whenever they get there.

    Having the hardware out there isn’t going to help gamers today, but it’s an install base that’s expanding and it’s a second vendor implementing the function in mass-market consumer hardware, and that helps fence sitters join the crowd of developers supporting RT. Think of it this way: anyone that has waited until there was a market consensus formed on ‘how to do RT’ that wasn’t just ‘The Nvidia Way’ now has their answer. By the time Intel catches up, any deviation from how things are settling now will work against Intel’s product, and so there is a real disincentive for Intel to stray; turning that back around, it becomes highly likely that Intel (and everyone else) is going to go with the flow. Which means that the water’s now safe to jump into.

    First, as cool as Raytracing is – I don’t play a single game that supports it yet, and even of the ones I am looking at getting, yeah… it would be single digit numbers for the near term. So it’s nice, but it’s far and away not my primary concern.

    The tech geek in me wants it to be a major concern for myself, but the basic reality is that of the games I do play, ray tracing is not a killer feature. I’ll note that our perspectives here are intensely personal, of course, and also likely temporary, but they’re also worth consideration and shouldn’t be discounted. Ray tracing is cool, but it hasn’t hit ‘critical mass’ just quite yet, in my opinion.

    Second – yeah, can’t buy any of these anyway, from any manufacturer. So … like @magoo says, moot point. Not to say this isn’t a good writeup. It’s great info to have. It just won’t influence any of my near purchases because… there’s nothing out there to purchase.

    And that’s the kicker ;)

    Many of us could drop a K on a GPU if one worth such an expense were even available, but they’re simply not!

  5. Plays fine for me at 1440p with RT on… I think on medium? I didn’t really check. Not even sure if it’s making a huge difference fidelity or experience wise. Maybe if I was taking screenshots to post on instagram or something?
  6. Plays fine for me at 1440p with RT on… I think on medium? I didn’t really check. Not even sure if it’s making a huge difference fidelity or experience wise. Maybe if I was taking screenshots to post on instagram or something?

    Ray tracing shouldn’t be like turning on ‘salesman mode’ on a TV in your dark basement; done right, you shouldn’t notice it. It’s the absence of immersion-breaking shadow and color artifacts that defines ray tracing, not some in-your-face effect!

  7. Ray tracing shouldn’t be like turning on ‘salesman mode’ on a TV in your dark basement; done right, you shouldn’t notice it. It’s the absence of immersion-breaking shadow and color artifacts that defines ray tracing, not some in-your-face effect!

    It takes so much more GPU power so you don’t notice it. Sounds like a salesman line trying to justify the XBR model TV to a customer on the brink, back in the days of WEGA TVs and such. ;)

  8. It takes so much more GPU power so you don’t notice it. Sounds like a salesman line trying to justify the XBR model TV to a customer on the brink, back in the days of WEGA TVs and such. ;)

    You notice when things are out of place. RT is designed to put them into place. So yeah, the absence of wacky rasterization artifacts is what you should notice.

  9. You notice when things are out of place. RT is designed to put them into place. So yeah, the absence of wacky rasterization artifacts is what you should notice.

    Well, there have always been two schools of thought with regard to things like this. The TV analogy @Grimlakin made is pretty good. Speakers/audio systems would be another good case.

    Some people think the screen (and in this case by extension, the GPU) is just there to translate the media. You shouldn’t notice anything imparted by the technology, unless it’s somehow deficient in showing the media. You want as accurate and precise a translation of the media as you can get, and the tech is just there to serve it to the end user. This would be like Cinema mode on a TV, where it tries to duplicate the theater experience, or my HT Receiver’s Pure Direct option, where it does no post-processing and tries to present the sound exactly as it’s presented by the media.

    Another camp thinks of the tech as part of the presentation – it’s not just there as a method to serve the media, it’s part of the experience itself. Crank the bass up, saturate the colors – take the original media and make it bigger and bolder.

    I can see both, and I’m often guilty of both. Right now, I think Raytracing tends to fall into the latter category – not everyone can do it, so you can’t really make a mainstream game rely on RT for anything other than putting in over-the-top effects. Eventually it may fall into the former – maybe as soon as the tail end of this generation of consoles, as RT hardware becomes more ubiquitous and alternative methods of implementing RT work around any lack of hardware.

  10. I think we are quickly going to hit a moment when CPUs, and current and even last-gen GPUs, can step up to the plate with a new way to think of the reflections and shadows in the game, rendering them as part of the scene without additional passes for proper reflections. Doing the reflections in code before passing to the GPU to render, as an example. Just making that process ultra efficient and bundling it into fewer, fatter data streams that the CPU is better at parsing and handling than the GPU would be with its millions of parallel processes.
  11. Some people think the screen (and in this case by extension, the GPU) is just there to translate the media. You shouldn’t notice anything imparted by the technology, unless it’s somehow deficient in showing the media. You want as accurate and precise a translation of the media as you can get, and the tech is just there to serve it to the end user.

    I think perhaps a better way for me to put it would be… you notice these things when you go back. That’s when their absence is felt. It’s the ‘we didn’t know any better’ answer to the future ‘how did we ever even live with that?!?’ question.

    I think we are quickly going to hit a moment when CPUs, and current and even last-gen GPUs, can step up to the plate with a new way to think of the reflections and shadows in the game, rendering them as part of the scene without additional passes for proper reflections. Doing the reflections in code before passing to the GPU to render, as an example. Just making that process ultra efficient and bundling it into fewer, fatter data streams that the CPU is better at parsing and handling than the GPU would be with its millions of parallel processes.

    I’ll say that CPUs are definitely not the way forward. Ray tracing, like many other data processing types, is best done on dedicated hardware. I’m not saying that GPUs are that, as they are also definitely not dedicated to that purpose (and can’t be for the foreseeable future, technically speaking), but they’re still several orders of magnitude better than CPUs.

    CPUs do branching code well. Anything that’s not branching, or doesn’t rely on massive in-order instruction streams (thus single-threaded), is better done on dedicated hardware. CPUs have this already in various forms of SIMD like SSE, AVX, and the old MMX, but ray tracing is on a completely different scale.

    As far as alternative means of lighting and refining the process in general, well, that has to happen regardless. Especially if any decent ray tracing will ever happen on the current console generation!

  12. I went to Microcenter on launch day to buy a 6900XT, as stock on NVIDIA cards was bad. I figured the RTX 3090 would be the faster option overall, but if I could get 90% of the performance for $500 less, it would be a great option. I’m glad I didn’t get the 6900XT. No DLSS and **** ray tracing performance would have disappointed me. It’s a non-starter for Cyberpunk 2077 at 4K, a game I have hundreds of hours in.