Image: CD PROJEKT RED

Is Cyberpunk 2077 the modern Crysis? Tom’s Hardware has shared some early benchmarks for CD PROJEKT RED’s latest masterpiece, and it appears to be extremely demanding at the 4K Ultra setting—so much so that NVIDIA’s flagship GeForce RTX 3090 can’t hit 60 FPS, even with ray tracing turned off. If there’s any title that truly requires DLSS, this appears to be it, as green team’s BFGPU is completely crushed at the native maximum preset when RTX is enabled.

You can check out the benchmarks below, which are mildly confusing given the many variables and how they’re color-coded. But here’s how Tom’s Hardware’s Jarred Walton explains them:

For the charts, we’ve color-coded Nvidia in blue, AMD in red, and Nvidia with ray tracing in green. We also have the simulated i7-7700K in light blue. The ray tracing results shown in the main chart are at native resolution—no DLSS. RTM indicates the use of the Ray Traced Medium preset (basically the same as the ultra preset, but with several ray tracing enhancements turned on), and RTU is for Ray Traced Ultra (nearly maxed out settings, with RT reflections enabled and higher-quality lighting).

For the ultra settings, we have a second chart showing DLSS performance on the RTX GPUs. We didn’t test every GPU at every possible option (DLSS Auto, Quality, Balanced, Performance, and Ultra Performance), but we did run the full suite of options on the RTX 3090 just for fun. Native results are in dark blue, DLSS Quality results are in green, DLSS Balanced in light blue, DLSS Performance in lighter red, and DLSS Ultra Performance in dark red. (DLSS Auto uses Quality mode at 1080p, Balanced mode at 1440p, and Performance mode at 4K, so we didn’t repeat those results in the charts.) At 4K, we also included a few results running without ray tracing—indicated by ‘Rast’ in the label (for rasterization) and in light blue.
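
To put the DLSS options above in concrete terms, here is a minimal sketch of how the mode selection and internal render resolutions work out. The Auto behavior (Quality at 1080p, Balanced at 1440p, Performance at 4K) comes from the quoted explanation; the per-mode render-scale factors are the commonly cited DLSS 2.0 values and are our assumption, not something Tom’s Hardware specifies.

```python
# Sketch: DLSS Auto mode selection per output resolution (per the quote above)
# and the approximate internal render resolution each mode implies.
# The scale factors below are commonly cited DLSS 2.0 values (assumed here,
# not taken from these benchmarks).
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~0.67x per axis
    "Balanced": 0.58,            # ~0.58x per axis
    "Performance": 0.50,         # 0.50x per axis
    "Ultra Performance": 1 / 3,  # ~0.33x per axis
}

# Auto mapping as described by Tom's Hardware
AUTO_MODE = {
    (1920, 1080): "Quality",
    (2560, 1440): "Balanced",
    (3840, 2160): "Performance",
}

def internal_resolution(width, height, mode):
    """Approximate resolution the GPU renders at before DLSS upscaling."""
    scale = DLSS_SCALE[mode]
    return round(width * scale), round(height * scale)

for (w, h), mode in AUTO_MODE.items():
    print(f"{w}x{h} output -> Auto picks {mode}, renders ~{internal_resolution(w, h, mode)}")
# e.g. 3840x2160 output -> Auto picks Performance, renders ~(1920, 1080)
```

At 4K, for example, DLSS Performance would mean the GPU is actually rendering at roughly 1920×1080 before upscaling, which is why the gains over native in the second chart look so dramatic.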

Cyberpunk 2077 Performance: Medium Preset

Cyberpunk 2077 Performance: Ultra Preset

It’s wild how the GeForce RTX 3090 can barely hit 73 FPS in the 4K Ultra + RTX preset even with DLSS’s new Ultra Performance mode enabled, which was primarily designed for 8K gaming. We’re hearing that CD PROJEKT RED will be releasing a massive 50+ GB day-zero patch that should improve performance somewhat, but it’s pretty clear that Cyberpunk 2077 fans who wish to experience the game at its highest fidelity will need some serious hardware.
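
For a sense of why Ultra Performance is billed as an 8K mode, the quick arithmetic below uses the same assumed one-third-per-axis scale factor as the sketch above to show roughly what the GPU renders before upscaling.

```python
# Ultra Performance upscales from roughly one third of the output resolution
# per axis (an assumed, commonly cited figure), i.e. about one ninth of the pixels.
def ultra_performance_internal(width, height, scale=1 / 3):
    return round(width * scale), round(height * scale)

print(ultra_performance_internal(7680, 4320))  # 8K output -> ~(2560, 1440)
print(ultra_performance_internal(3840, 2160))  # 4K output -> ~(1280, 720)
```

If that figure holds for Cyberpunk 2077, the card is effectively rendering and ray tracing at around 1280×720 internally and still only averaging about 73 FPS, which underlines just how heavy the ray-traced Ultra settings are.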

Cyberpunk 2077 will be released for the PC, Xbox One, PlayStation 4, and Google Stadia on December 10. NVIDIA’s Game Ready Driver, which will presumably land in a day or two, should also help with the performance.


22 Comments

  1. Hahaha…..caught publishing data that doesn’t mean sh!t
    But really now, it’s supposed to be demanding; that’s what used to be the norm.
    How many times did you replay a video game even 5 years ago on new hardware and get to see how it was designed to look?
    At any rate, it’s all vaporware at present anyway.
    Review sites ought to publish the data on hardware that folks actually HAVE.
  2. They need to review what is released so people, when they can get one, know what they are getting and can make an informed opinion. Beyond that, the flavor videos and super overclocking articles are less helpful.
  3. Let’s see what happens after the game actually releases and the 49 GB day-1 patch is applied… and new game-ready video drivers, of course.
  4. Hahaha…..caught publishing data that doesn’t mean sh!t
    But really now, it’s supposed to be demanding; that’s what used to be the norm.
    How many times did you replay a video game even 5 years ago on new hardware and get to see how it was designed to look?
    At any rate, it’s all vaporware at present anyway.
    Review sites ought to publish the data on hardware that folks actually HAVE.

    It’s tomshardware, what did you expect? :LOL: :LOL: :p :p ;) ;)

  5. How many times did you replay a video game even 5 years ago on new hardware and get to see how it was designed to look?

    Too true! People need to grow up and get a clue. All of the Witcher games absolutely crushed the GPUs of their time. It wasn’t until I got a 2080 Ti that I could even play W3 at 4K/60+ FPS with max settings and mods. W2 with ubersampling (essentially 8K) still brings it down, but that also has to do with it being DX9.0c.

    The only thing I ever doubted, and still doubt for most games, is the recommended/required specs. One thing a reviewer on Tom’s did say is that perhaps CDPR’s recommendations were for those looking at a 30-40 FPS experience, and I hate to say it, but I agree. Publishers do seem to consistently understate, and even withhold, what they mean by PC specs. The recommended specs seemed woefully underpowered for this game, and I doubt patching will improve more than maybe 20-30% in terms of FPS. Not to mention, remember when CDPR got a bad rap for the difference in textures between the original W3 footage and how it looked at release? People can’t have it both ways, wanting the best-looking possible game and then expecting current hardware to play it at its best. Sure, optimizations will need to be made, but there’s more to the story than that.

    As far as calling the 3090 an 8K card, well, most of us already knew that would only apply to games that demanded very little. All the hype with Death Stranding should’ve been a good clue when they talked about the impressive numbers seen in 4K with RTX 2070s and 2060s. Those who want more than the Fortnite experience have years to go before a true 8K card is here. Honestly, it wasn’t until the latest gen of cards from AMD and NVIDIA that we’re even really seeing great 1440p experiences, so 4K still has compromises.

  6. But really now, it’s supposed to be demanding; that’s what used to be the norm.
    How many times did you replay a video game even 5 years ago on new hardware and get to see how it was designed to look?

    Agree! Going waaay back, I remember needing upgrades to really handle, say, the various Quakes and Unreals. Crysis was the last game I remember that really pushed hardware – give me titles that are so demanding that they inspire me to want to try multi-GPU, as that makes playing with hardware so much more interesting.

  7. Agree! Going waaay back, I remember needing upgrades to really handle, say, the various Quakes and Unreals. Crysis was the last game I remember that really pushed hardware – give me titles that are so demanding that they inspire me to want to try multi-GPU, as that makes playing with hardware so much more interesting.

    Before The Witcher 3, Metro 2033 was the last one I remember.

  8. Well, years back, CPU and GPU generations brought huge improvements. A year-and-a-half-old computer was old; a three-year-old computer was obsolete. So you could publish a game that pushed hardware hard and know that the hardware would catch up soon, and it would be a big enough jump that most of your players were going to upgrade shortly.

    Now… you just don’t have that same leap in performance generation over generation. Users are holding onto hardware longer, and why would a publisher push the minimum performance barrier if many (most?) gamers aren’t going to upgrade any time soon to get there?

  9. Well, years back, CPU and GPU generations brought huge improvements. A year-and-a-half-old computer was old; a three-year-old computer was obsolete. So you could publish a game that pushed hardware hard and know that the hardware would catch up soon, and it would be a big enough jump that most of your players were going to upgrade shortly.

    Now… you just don’t have that same leap in performance generation over generation. Users are holding onto hardware longer, and why would a publisher push the minimum performance barrier if many (most?) gamers aren’t going to upgrade any time soon to get there?

    What motivation do they have to upgrade if publishers don’t push the engine? I rode my 680 and 1080 Ti longer than any other video cards that I’ve ever owned, because the games just didn’t really try to push the envelope. I wouldn’t be trying to upgrade my 1080 Ti except that I want to increase my Folding@Home output – the 1080 Ti plays pretty much everything at 2560×1600 without issue. Of course, I’m eyeing a ROG Swift PG32UQX as my next monitor, and if I snag one of those, then I’ll have several generations of buying the best available video card ahead of me.

  10. Well, years back, CPU and GPU generations brought huge improvements. A year-and-a-half-old computer was old; a three-year-old computer was obsolete. So you could publish a game that pushed hardware hard and know that the hardware would catch up soon, and it would be a big enough jump that most of your players were going to upgrade shortly.

    Now… you just don’t have that same leap in performance generation over generation. Users are holding onto hardware longer, and why would a publisher push the minimum performance barrier if many (most?) gamers aren’t going to upgrade any time soon to get there?

    The generational leaps really were not much different from today. Historically, generational leaps between video cards have averaged 30-40% for the 15 years I have data for. It has actually been trending up over the last few generations, believe it or not, primarily thanks to the jumps to Pascal, Ampere, and Navi.

    CPU performance is a different story, but there are too many data points for me to want to compile in my free time on that front.

  11. Nothing new here, people want the best graphics but then also want it to run on their integrated potato. Well you can’t have it both ways. I actually prefer a future proof game than one that runs at 60 fps on a mid range card. That means there is untapped potential. Ever since GTAIV I’ve been fighting this battle.
  12. Nothing scientific to report since I didn’t have any OSD running and only had about 10-20 minutes to play, but it seemed to play just fine on my 3700X/3090 rig. I checked settings and everything defaulted to Ultra; it even has 32:9 support out of the box. I left DLSS on Auto. Honestly, it seemed pretty smooth. I’m sure the guys will do an in-depth review, but that was my first impression. I haven’t even updated to the game-ready driver yet, but I probably got the patch. I’ll definitely be playing more this weekend and testing on both my displays as well. DLSS seems to be the more advanced version, as it seemed to be working at 5120×1440 while, so far, RT does not in any game I’ve tested it on. You can usually tell by it being greyed out, along with the huge performance hits. Can’t wait to see some of those funny glitch bugs everyone’s talking about, though. :)
  13. Nothing scientific to report since I didn’t have any OSD running and only had about 10-20 minutes to play, but it seemed to play just fine on my 3700X/3090 rig. I checked settings and everything defaulted to Ultra; it even has 32:9 support out of the box. I left DLSS on Auto. Honestly, it seemed pretty smooth. I’m sure the guys will do an in-depth review, but that was my first impression. I haven’t even updated to the game-ready driver yet, but I probably got the patch. I’ll definitely be playing more this weekend and testing on both my displays as well. DLSS seems to be the more advanced version, as it seemed to be working at 5120×1440 while, so far, RT does not in any game I’ve tested it on. You can usually tell by it being greyed out, along with the huge performance hits. Can’t wait to see some of those funny glitch bugs everyone’s talking about, though. :)

    I started to notice some visual downgrades when DLSS turned super aggressive going outside with it set to Auto, so I need to do some tweaking of the settings to be able to run it on Balanced. Ray tracing looks very well done here, but it starts to look weird when DLSS goes into Ultra Performance mode. Some post-processed effects and objects also start to look jaggy.

    And by the way, it seems like the game is defaulting to the Ultra preset for almost everyone no matter their hardware configuration. I don’t think the game is detecting your hardware to automatically set recommended settings.

  14. I started to notice some visual downgrades when DLSS turned super aggressive going outside with it set to Auto, so I need to do some tweaking of the settings to be able to run it on Balanced. Ray tracing looks very well done here, but it starts to look weird when DLSS goes into Ultra Performance mode. Some post-processed effects and objects also start to look jaggy.

    And by the way, it seems like the game is defaulting to the Ultra preset for almost everyone no matter their hardware configuration. I don’t think the game is detecting your hardware to automatically set recommended settings.

    I wondered if autodetect was working right. Seemed a little strange considering the performance issue reports. Really looking forward to digging into it this weekend.

    edit: I did notice some texture pop-in or changes happen as I looked around the landscape. Could be a little related to what you saw with DLSS as well.

  15. I wondered if autodetect was working right. Seemed a little strange considering the performance issue reports. Really looking forward to digging into it this weekend.

    edit: I did notice some texture pop-in or changes happen as I looked around the landscape. Could be a little related to what you saw with DLSS as well.

    We have a discussion going on in the PC Gaming thread suggesting that asset loading is probably tied to your drive speed. It seems like even if the game is installed on a PCIe NVMe drive, there are glitches with assets loading in, and it gets worse the slower the drive you have. The options menu has a setting for "Slow HDD" if that is any indication.

  16. We have a discussion going on in the PC Gaming thread suggesting that asset loading is probably tied to your drive speed. It seems like even if the game is installed on a PCIe NVMe drive, there are glitches with assets loading in, and it gets worse the slower the drive you have. The options menu has a setting for "Slow HDD" if that is any indication.

    Wow! Wasn’t aware of that. I’ve got it on my Sabrent PCIe 4.0 drive. This game played a large factor in why/how I designed this rig.

  17. It wasn’t horrible, but noticeable. Curious how those who can use Smart Access Memory are faring?