Image: NVIDIA

Rockstar Games has released its NVIDIA DLSS update for Red Dead Redemption 2 and its online component, Red Dead Online, enabling the green team’s deep learning super sampling technology in the popular action-adventure games and giving them a welcome performance boost. According to NVIDIA’s own testing, Red Dead Redemption 2 and Red Dead Online players with GeForce RTX graphics cards can enjoy performance accelerated by up to 45 percent at 4K. In both games, NVIDIA DLSS can be enabled in the Settings menu under the Graphics section.

Image: NVIDIA

With the aid of NVIDIA DLSS, all GeForce RTX gamers can experience Red Dead Redemption 2’s incredible world with max settings at 1920×1080 at over 60 FPS. At 2560×1440, GeForce RTX 3060 Ti and above users can ride out at over 60 FPS. And at 3840×2160, gamers with a GeForce RTX 3070 or faster GPU can enjoy 60+ FPS max-setting gameplay, for the most detailed, immersive, and engrossing Red Dead Redemption 2 and Red Dead Online experience possible.

Source: NVIDIA


Join the Conversation


  1. Waiting for the “It looks better than original” crowd. I’m sure it’s better than AMD’s, but it’s certainly not perfect.

    This is a poor showing on a static image – a good bit of detail lost in this example without even needing a microscope. Maybe it looks better in motion / in person though.

  2. Well, the consensus seems to be that DLSS and FSR are pretty close at their highest quality settings, more so at 4K (edge on the NVIDIA side), but once you go performance mode, there’s no contest.

  3. Downloading the patch on one rig now: the one in my sig, the one I use for 4K/5120×1440 gaming. I’ll let you know how it goes. This game is so graphically demanding that I usually have to dial things back anyway for a solid 60-100 FPS goal. I’m fine with some IQ compromise as long as it’s not worse than what I’ve already had to do, and I’m looking forward to offloading some of it to those tensor cores.

    I’m also looking forward to seeing how the new laptop handles this (130 W 3070) at 1080p or 2560×1080, but I’ll be downloading it to that machine for the first time and that’ll take a bit.

  4. I enjoyed the game, but outside of messing with DLSS if you have it, I’m not feeling RDR2 as a big keep-playing-it game. What am I missing?

  5. [QUOTE=”Grimlakin, post: 37671, member: 215″]
    I enjoyed the game, but outside of messing with DLSS if you have it, I’m not feeling RDR2 as a big keep-playing-it game. What am I missing?
    [/QUOTE]
    Never had a console to play the first one, so I was really looking forward to playing this when it came out on PC, but even a 2080 Ti struggled back then at 4K. Things got better with a 3090, but that was relative to going from around 30-40 FPS up to around 60+ with the right settings. I usually like it closer to 100 but couldn’t pull that off. That being said, I stopped playing shortly after getting off of the mountain and decided to wait.

  6. I did some testing with the laptop at 1080p/2560×1080 last night. Pretty impressive. I used the ultra preset setting with DLSS balanced, and it averaged 74-100 FPS during the benchmark and while I did some strolling around the winter mountain section. I did notice some IQ degradation, but not enough to make me not want to use DLSS. It peaked around 133 and the lowest was around 70. I switched DLSS to high performance and gained maybe 10 FPS, but artifacts became much more apparent. I’ll try it on the 3090/3700X rig tonight at 5120×1440.

    I also tried tweaking a couple of other settings with only a minimal FPS decrease (maybe 3 or 4): I turned off motion blur and maxed out a couple of quality settings.

  7. [QUOTE=”Peter_Brosdahl, post: 37693, member: 87″]
    I did some testing with the laptop at 1080p/2560×1080 last night. Pretty impressive. I used the ultra preset setting with DLSS balanced, and it averaged 74-100 FPS during the benchmark and while I did some strolling around the winter mountain section. I did notice some IQ degradation, but not enough to make me not want to use DLSS. It peaked around 133 and the lowest was around 70. I switched DLSS to high performance and gained maybe 10 FPS, but artifacts became much more apparent. I’ll try it on the 3090/3700X rig tonight at 5120×1440.

    I also tried tweaking a couple of other settings with only a minimal FPS decrease (maybe 3 or 4): I turned off motion blur and maxed out a couple of quality settings.
    [/QUOTE]
    Try the new Sharpen+ filter from the GeForce Experience overlay?
    [URL unfurl=”true”]https://www.techpowerup.com/forums/threads/nvidia-sharpen-filter-things-just-got-more-interesting.283793/[/URL]

  8. [QUOTE=”Brian_B, post: 37662, member: 96″]
    Waiting for the “It looks better than original” crowd. I’m sure it’s better than AMD’s, but it’s certainly not perfect.

    This is a poor showing on a static image – a good bit of detail lost in this example without even needing a microscope. Maybe it looks better in motion / in person though.
    [/QUOTE]
    Your opinion don’t count, man, monster GPU owner….

  9. [QUOTE=”Auer, post: 37696, member: 225″]
    Try the new Sharpen+ filter from the Geforce Experience overlay?
    [URL unfurl=”true”]https://www.techpowerup.com/forums/threads/nvidia-sharpen-filter-things-just-got-more-interesting.283793/[/URL]
    [/QUOTE]
    I’ll have to check that out. I don’t normally install GE. I stopped using it years ago when the game optimizations basically fell on their face, so I didn’t really have a use for it. This does look interesting. Thanks for the heads up!

  10. DSOG wasn’t overly impressed with it. I don’t agree with their assessment that 12 FPS isn’t an impressive gain. Anytime I gain that much I’m usually pretty happy about it, but I do agree that’s a far cry from the 40% NV and RS boasted about. I also agree about the aliasing that happens, but more so for distant stuff; I noticed the same. It didn’t bother me that much, though. I also wonder how much of their bench may have been hampered by the GPU used (a 3080 with only 10 GB of VRAM). That’s fine for 1080p and most 1440p, but at 4K this game can get VRAM hungry. When I was testing at 2560×1080 I saw it using between 5 and 7 GB with the same Ultra settings they used, so just imagine what happens at 4K. I’ll let you know what I see when I test it with my 3090 at 4K / 5120×1440.

    [URL unfurl=”true”]https://www.dsogaming.com/pc-performance-analyses/red-dead-redemption-2-dlss-2-2-10-0-benchmarks/[/URL]

  11. I had a few minutes to spare this morning. Well, I don’t believe VRAM is an issue for anyone with more than 8 GB. During the benchmark, using the highest preset with DLSS balanced, it mostly held that same 5-7 GB at 5120×1440. However, towards the end of the last test it went above 8 GB for the final chase. Enjoy the screenshots!

    I still contend that even though aliasing is prevalent, it’s worth it for me to get those FPS upwards of 100, which means I can do some tweaking for my own tastes.
