Test Setup

Please read our GPU Test Bench and Benchmarking Refresher for an explanation of our test system, procedures, and goals.  More information on our GPU testing can be found here.  Check out our KIT page where you can see all the components in our test system configuration for reviewing video cards.

NVIDIA provided us with a press driver for the GeForce RTX 3080 Founders Edition, version 456.16, and we used this driver on every video card.

Brent Justice

Brent Justice has been reviewing computer components for 20+ years. Educated in the art and method of the computer hardware review, he brings experience, knowledge, and hands-on testing with a gamer-oriented...

Join the Conversation

75 Comments

  1. Ok.. So basically if you’re still at 1080p (or even 1440p), this card is overkill. I’ve been holding off on moving to 4k for the longest time now, and it looks like my patience will be rewarded. My plan is to get a couple of 4k monitors, and then this card would be a nice driver for those.

    Brent, excellent review as always! Thank you FPS Review for tremendous coverage!

  2. So it’s true that those leaked benches of Shadow of the Tomb Raider and Far Cry: New Dawn were worst-case scenarios for this comparison. Looks like on average the 3080 is around 70% faster than the 2080 at 4K in rasterization and 30% faster than the 2080 Ti (a quick sanity check of those numbers is sketched below). Add ray tracing and it almost hits the "double performance" claim NVIDIA was making. That is almost exactly the same performance improvement we saw with the Pascal generation.

    Can’t wait to see the 3090 review 😁
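
    A rough sanity check of those percentages, treating the quoted 70% and 30% averages as given (back-of-the-envelope arithmetic only, not measured data):

      # If the 3080 is ~1.70x a 2080 and ~1.30x a 2080 Ti at 4K, the implied
      # 2080 Ti vs. 2080 gap is 1.70 / 1.30 ~= 1.31x, i.e. roughly the ~30%
      # advantage the 2080 Ti normally holds over the 2080, so the numbers hang together.
      speedup_vs_2080 = 1.70
      speedup_vs_2080_ti = 1.30
      implied_2080_ti_over_2080 = speedup_vs_2080 / speedup_vs_2080_ti
      print(f"Implied 2080 Ti over 2080: {implied_2080_ti_over_2080:.2f}x")  # ~1.31x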

  3. It seems to me that the increase in power consumption is larger than the increase in performance. I’d have expected the opposite.

    Too bad Control is still unplayable in 4K with ray tracing; well, maybe with the 3090 :D

  4. It seems to me that the increase in power consumption is larger than the increase in performance. I’d have expected the opposite.

    Too bad Control is still unplayable in 4K with ray tracing; well, maybe with the 3090 :D

    +29% power consumption compared to the 2080, with a performance increase of +70%. The increase in performance is larger than the increase in power. Not quite the 1.7x efficiency NVIDIA stated, but close.

    Control is quite playable with ray tracing using DLSS as pointed out in the review, and it looks better than native thanks to DLSS 2.0.

    1. Make that a 90% promised energy efficiency increase. Seems to me that if NVIDIA had gone 7 nm, it would have been pretty much spot on.

      Anyway, I think it’s still great. Huge performance increase for a moderate increase in power draw. I’ll take that, especially after watching AMD get nowhere close to NVIDIA’s power efficiency. Hopefully the tables will finally turn with the RX 6000.

      Great time to be a gamer.

  5. +29% power consumption compared to the 2080, with a performance increase of +70%. The increase in performance is larger than the increase in power. Not quite the 1.7x efficiency NVIDIA stated, but close.

    Control is quite playable with ray tracing using DLSS as pointed out in the review, and it looks better than native thanks to DLSS 2.0.

    Mind you, the +29% is compared to the entire system wattage, not the GPU. If you could isolate the GPU power consumption, the percentage increase would be much larger; probably still not 70%, but close to 50% (a rough sketch of that arithmetic follows below).

    With DLSS, Control is already playable on the 2080 Ti; I was of course referring to running without DLSS. With the claimed double ray tracing performance, the 3080 should be able to handle it without DLSS.
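
    A rough sketch of that arithmetic, using the +70% performance and +29% total-system-power figures quoted above; the 450 W system total and 200 W non-GPU baseline below are assumptions for illustration only, not measured values:

      # Perf-per-watt at the wall: 1.70 / 1.29 ~= 1.32x, short of NVIDIA's headline efficiency claim.
      perf_gain = 1.70          # 3080 vs. 2080 at 4K (from the review)
      system_power_gain = 1.29  # total system draw, 3080 vs. 2080
      print(f"System-level perf/W gain: {perf_gain / system_power_gain:.2f}x")

      # Isolating the GPU: if the rest of the system draws a roughly constant amount,
      # the GPU-only increase is larger than the system-level +29%.
      rest_of_system_w = 200.0  # assumed constant non-GPU baseline (hypothetical)
      system_2080_w = 450.0     # assumed total system draw with a 2080 (hypothetical)
      system_3080_w = system_2080_w * system_power_gain
      gpu_increase = (system_3080_w - rest_of_system_w) / (system_2080_w - rest_of_system_w) - 1
      print(f"Implied GPU-only power increase: {gpu_increase:.0%}")  # ~52% under these assumptions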

  6. Who cares about power draw? Unless you’re trying to use a PSU that’s on the ragged edge of being sufficient.
  7. Who cares about power draw? Unless you’re trying to use a PSU that’s on the ragged edge of being sufficient.

    I was talking with a lot of the guys I play Destiny 2 with last night. Most of them are running units below 700w with stock Core i9-9900K’s and either RTX 2080 or RTX 2080 Ti cards. Any of them thinking about upgrading to the 30 series are contemplating PSU upgrades along with the graphics cards. Most are looking at the 3080 rather than the 3090.

  8. First site I went to when I woke up this morning to see this review and was not disappointed. Answered all my questions and I’m actually surprised at how well this thing performs.

    The thing that surprises me most is the 850 watt PSU recommendation! I figured it was just marketing BS when Nvidia first announced they recommended a 750 at minimum and figured I could run my EVGA 750 a little longer. Glad I waited for the review cause I never would’ve guessed I’d need to account for that in my budget too. But it’s an excuse to buy something new so I’m sure the wife won’t mind an extra $125 on the old credit card lol.

    I also don’t think this would be overkill for 1080p if you’re running a 240 Hz monitor. Judging by the benchmarks this would be about perfect. My rig currently runs Breakpoint at around 100-110 fps with mostly high settings, but the 3080 would likely get 140+ with maxed settings. I think that would be worth the investment.

    I’m gonna buy mine at Best Buy. My local store usually has a good selection of PC parts and I like being able to have an actual store I can take something back to if it craps the bed on me.

  9. I’m planning on picking one up (if I get lucky tomorrow) and running it on my 650 watt Seasonic Titanium. It’s currently running a 2700X, 1080 Ti, and 1660 Super, so I’m not concerned about replacing both video cards with one 3080. I’ll upgrade the PSU to a 1000 W when the next-gen Ryzen CPUs drop.

    Now, I also intend on trying to get a 3090 – if I end up with both, I’ll sell the 3080. I’m a little more wary of the 3090 on that 650 W PSU, but the difference is only like 30 watts. I’ll just wait to OC whichever card till I upgrade the PSU.

  10. Honestly not as fast as I had hoped but not bad in any way shape or form. I honestly would not understand many 2080 Ti owners making the jump to this and so we wait on the 3090 Reviews!
    Great Review as usual Brent!
  11. Honestly not as fast as I had hoped but not bad in any way shape or form. I honestly would not understand many 2080 Ti owners making the jump to this and so we wait on the 3090 Reviews!
    Great Review as usual Brent!

    Out of curiosity I checked the local classifieds for 2080 Ti listings, and there aren’t that many; half of them are listed around $1000, with the lowest at $700. Keep on dreaming, boys. (I was thinking about picking up a second 2080 Ti – not anymore.)

  12. You may have done a few spot tests and found no real difference, but did changing between PCIe 3.x and 4.x modes make any difference to the card’s performance?
  13. Nice review, and the card looks nice too. The only issue I have is the performance is a bit lower than it could be due to the use of the 3700X system. I know AMD gives you PCIe 4.0, but it lags the Intel CPUs by about 7-8% (the 3900XT does). A dual test setup would be nice.

  14. Nice review, and the card looks nice too. The only issue I have is the performance is a bit lower than it could be due to the use of the 3700X system. I know AMD gives you PCIe 4.0, but it lags the Intel CPUs by about 7-8% (the 3900XT does). A dual test setup would be nice.

    https://www.techpowerup…-rtx-3080-amd-3900-xt-vs-intel-10900k/26.html

    AMD still lags a bit in IPC compared to Intel, so in resolutions that are more CPU-dependent like 1920×1080 you’re going to see a big difference. The more the GPU struggles, the more that difference disappears. It’s practically non-existent at 4K.

  15. AMD still lags a bit in IPC compared to Intel, so in resolutions that are more CPU-dependent like 1920×1080 you’re going to see a big difference. The more the GPU struggles, the more that difference disappears. It’s practically non-existent at 4K.

    I thought in the current generation that IPC lead was actually on the Ryzen CPU’s side.

    At least according to this techspot article..

    https://www.techspot.com/article/1876-4ghz-ryzen-3rd-gen-vs-core-i9/

    Not saying it’s a huge difference, though Intel does still lead in raw GHz throughput.

  16. I thought in the current generation that IPC lead was actually on the Ryzen CPU’s side.

    At least according to this techspot article..

    https://www.techspot.com/article/1876-4ghz-ryzen-3rd-gen-vs-core-i9/

    Not saying it’s a huge difference, though Intel does still lead in raw GHz throughput.

    In the context of gaming, which is what we’re talking about with the 3080, it is still Intel. This is at 1280×720, so basically as much CPU dependence as you can get in a modern game. No question that Ryzen is better in other types of applications, especially multithreaded ones.

    View attachment 429

  17. I just hope my 650 W PSU can handle it; it’s a Seasonic Titanium one. Otherwise I will need to swap in the 850 from my X299.
  18. In the context of gaming, which is what we’re talking about with the 3080, it is still Intel. This is at 1280×720, so basically as much CPU dependence as you can get in a modern game. No question that Ryzen is better in other types of applications, especially multithreaded ones.

    View attachment 429

    To compare IPC you’d have to lock both CPUs to the same clocks. This just tells us that a higher-clocked CPU is faster.

  19. The in-depth review I’ve been looking for! Thanks for making it pretty straightforward and easy to understand, even for a newbie such as myself.

    Based on Microsoft FS performance, I’m really hoping MS does continue to optimize the game’s performance. Until then, here’s hoping the new architecture from AMD improves things a bit. Otherwise I’m going to have to start selling blood to buy a Threadripper!

  20. The in-depth review I’ve been looking for! Thanks for making it pretty straightforward and easy to understand, even for a newbie such as myself.

    Based on Microsoft FS performance, I’m really hoping MS does continue to optimize the game’s performance. Until then, here’s hoping the new architecture from AMD improves things a bit. Otherwise I’m going to have to start selling blood to buy a Threadripper!

    M$ has always developed new releases of Flight Sim to be ahead of current GPU performance levels. It’s by design, so the game has plenty of room to grow over many years. FSX was the same way, as was FS98.

  21. The in-depth review I’ve been looking for! Thanks for making it pretty straightforward and easy to understand, even for a newbie such as myself.

    Based on Microsoft FS performance, I’m really hoping MS does continue to optimize the game’s performance. Until then, here’s hoping the new architecture from AMD improves things a bit. Otherwise I’m going to have to start selling blood to buy a Threadripper!

    You’re in luck! New patch released today addresses CPU performance impact by preventing interruption of rendering threads, among other things.

    https://www.flightsimulator.com/patch-version-1-8-3-0-is-now-available/

  22. Yeah, it’s time to hand down the 980 Ti to the kids’ computer and get a 3080…if I can manage to actually place an order before they go OoS.
  23. It seems to me that the increase in power consumption is larger than the increase in performance. I’d have expected the opposite.

    Too bad Control is still unplayable in 4K with ray tracing; well, maybe with the 3090 :D

    Well, Control has DLSS 2.0 now right?

    From the comparisons I have seen, DLSS 2.0 really doesn’t result in much of an image quality degradation, and sometimes even looks better, so as long as the 3090 can handle it with DLSS on, I’ll consider that a success.

  24. VP where I work has an Aorus Waterforce 2080 AIO with a 240mm rad. Says it never gets over 60c. And he’s had no issues with it, at least for the last year. It’s just a closed loop system. They work quite well.

    I have dual 360 rads, one 25mm and one 35mm. Bring on the heat.

  25. VP where I work has an Aorus Waterforce 2080 AIO with a 240mm rad. Says it never gets over 60c. And he’s had no issues with it, at least for the last year. It’s just a closed loop system. They work quite well.

    I have dual 360 rads, one 25mm and one 35mm. Bring on the heat.

    Yeah, AIOs usually get you into the 60s overclocked and loaded up.

    My WC loop kept my Pascal Titan under 40C overclocked and loaded up. That’s my target because under 40C I seem to have been getting better boost clocks.

    Question is if the temp calculus needs to change considering the massive thermal envelopes of these things.

  26. great review…very detailed…I love the games you tested and the fact that you enabled things like AA, PhysX, Hairworks etc…so you pretty much maxed out the graphics…lots of other 3080 reviews disabled a lot of the advanced graphics settings

    me personally I’m waiting for the 3080 20GB variant…I’m in the process of building a new Zen 3 system so I can afford to be patient

  27. Do Metro Exodus and SotTR still use DLSS 1? If so, do they still exhibit the same issues, like blur and smear?

    I really hope DLSS 2.x becomes a trend. By now there should be dozens of games and patches for DLSS, but still only a handful of games support it, and only a couple of them actually look awesome.
    /rant

  28. Honestly not as fast as I had hoped but not bad in any way shape or form. I honestly would not understand many 2080 Ti owners making the jump to this and so we wait on the 3090 Reviews!
    Great Review as usual Brent!

    Even if this were a huge upgrade over the RTX 2080 Ti, I’d still wait for the 3090. The only way I’d buy a 3080 is if the 3090 was less than 7% faster than a 3080 at twice the price or something stupid like that.

  29. Even if this were a huge upgrade over the RTX 2080 Ti, I’d still wait for the 3090. The only way I’d buy a 3080 is if the 3090 was less than 7% faster than a 3080 at twice the price or something stupid like that.

    20% faster for the price would still be a big NO NO for me even if I could spare the cash. But for people that are already used to paying $1,000+ for a card I guess I can see that happening. And people that already have an RTX 2080 Ti have nowhere else to go.

  30. 20% faster for the price would still be a big NO NO for me even if I could spare the cash. But for people that are already used to paying $1,000+ for a card I guess I can see that happening. And people that already have an RTX 2080 Ti have nowhere else to go.

    Enthusiast-level cards have always been like this since at least the 8800 Ultra. The 8800 Ultra was about 48% more expensive than an 8800 GTX for 10% more performance.

    Big variable that needs to be considered with the 3090 vs. the 3080, though, is the amount of memory. GDDR6X is supposedly twice as expensive as GDDR6 for 8Gb chips in bulk (around $24/chip compared to $12/chip). That would make the 24GB on the 3090 $576 vs. $240 for the 10GB on the 3080 (the math is spelled out below). This is not the only factor accounting for the price difference, but it is a big one.

    If you want the fastest single gaming card available and have the money to buy it, though, then why not.
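
    The memory-cost arithmetic from that post, spelled out; the per-chip prices are the rumored bulk figures quoted above, not confirmed numbers:

      # 8 Gb (gigabit) per chip = 1 GB per chip, so chip count equals capacity in GB.
      GDDR6X_PER_CHIP = 24.0  # rumored bulk price, USD
      GDDR6_PER_CHIP = 12.0   # rumored bulk price, USD (for reference only)

      cost_3090_24gb = 24 * GDDR6X_PER_CHIP  # $576
      cost_3080_10gb = 10 * GDDR6X_PER_CHIP  # $240
      print(f"3090 24 GB: ${cost_3090_24gb:.0f}, 3080 10 GB: ${cost_3080_10gb:.0f}")
      print(f"Memory cost delta alone: ${cost_3090_24gb - cost_3080_10gb:.0f}")  # $336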

  31. Flight Simulator 2020 Re-Testing with New Patch
    9/18/2020

    Thanks to Armenius, I became aware of this new patch for Flight Sim 2020, which makes many adjustments to performance. Therefore I decided to re-test the game with the new patch on the RTX 3080 (with the same driver) to see if there are any changes. These are my results.

    1440p Ultra – No Change In Performance
    1440p High-End – FPS went from 46 FPS to 47.8 FPS AVG

    4K Ultra – FPS went from 29 FPS to 31.3 FPS AVG
    4K High-End – FPS went from 42.6 FPS to 46 FPS AVG

    The end result is that in the "High-End" Quality Preset, I saw a larger performance bump with the new patch. 4K "High-End" was the biggest performance leap.

    In the "Ultra" Quality Preset I only saw a very small increase at 4K "Ultra". However, at 1440p "Ultra" there was no difference.

    These are by no means game-changing numbers here, but it is good to see 4K "High-End" performance increasing, I just wish "Ultra" Quality performance increased more.

    Overall, it also seems the bigger changes are at 4K rather than 1440p (the relative gains are computed below).
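
    The relative gains behind those numbers, computed from the before/after averages above:

      # Percentage improvement from the new patch, per test (before, after in FPS AVG).
      results = {
          "1440p High-End": (46.0, 47.8),
          "4K Ultra":       (29.0, 31.3),
          "4K High-End":    (42.6, 46.0),
      }
      for test, (before, after) in results.items():
          print(f"{test}: +{(after / before - 1) * 100:.1f}%")
      # 1440p High-End: +3.9%, 4K Ultra: +7.9%, 4K High-End: +8.0% (1440p Ultra: unchanged)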

  32. Thanks, @Brent_Justice for such an in-depth and great review. As always, feel like I’ve been taken back to school. Now just to retain it. Had to wait until tonight until I had time to really read through it.
  33. Enthusiast-level cards have always been like this since at least the 8800 Ultra. The 8800 Ultra was about 48% more expensive than an 8800 GTX for 10% more performance.

    Big variable that needs to be considered with the 3090 vs. the 3080, though, is the amount of memory. GDDR6X is supposedly twice as expensive as GDDR6 for 8Gb chips in bulk (around $24/chip compared to $12/chip). That would make the 24GB on the 3090 $576 vs. $240 for the 10GB on the 3080. This is not the only factor accounting for the price difference, but it is a big one.

    If you want the fastest single gaming card available and have the money to buy it, though, then why not.

    I’m aware of the law of diminishing returns. I always try to get the best bang for the buck, which for now IMO is the RTX 3080, though that will surely be replaced soon by the RTX 3070 or RX 6000.

    But I agree, whatever makes you happy no matter the cost, it’s fine. Probably if I had the cash, I’d eat my words and end up getting one too 😁😁

  34. But I agree, whatever makes you happy no matter the cost it’s fine. Probably if I had the cash, I’d eat my words and end up getting one too

    I used to mock those who bought Titans for gaming back in the Maxwell days. Now, I just start saving for whatever the next biggest hammer will be right after a release happens. Best value? Of course not. Best experience? Better believe it. It’s also nice seeing these top-tier cards usually age gracefully and knowing you’re going to get at least 2 years of top-end performance out of them. My first x80 Ti was a 1080 Ti and it’s still chugging away 4+ years later at a reasonable level. The 2080 Ti I have now will end up in another rig and still be decent for 1440p for a year or two longer. Initial sticker shock sucks, people jump on the hate trains, but 3-4 years down the road that same card is doing OK and I think to myself what a great ride it’s been. :giggle:

  35. Even if this were a huge upgrade over the RTX 2080 Ti, I’d still wait for the 3090. The only way I’d buy a 3080 is if the 3090 was less than 7% faster than a 3080 at twice the price or something stupid like that.

    Yeah, and we both know a 3080 Ti will likely come out.

  36. I don’t think it will. I think it will be a 3080 Super. It seems like NVIDIA is getting away from the "Ti" naming scheme.

    I think he means a faster/beefier version of the 3080; whether it’s called Super, Ti, hyper, ultra, or jumbo is irrelevant.

  37. I think he means a faster/beefier version of the 3080; whether it’s called Super, Ti, hyper, ultra, or jumbo is irrelevant.

    Yeah, from memory of ‘Ti’ and ‘Super’ releases, the only real common thread is that they have better specifications than whatever they are a ‘Ti’ or ‘Super’ of. Could be the same GPU die with faster memory, more memory, more compute resources unlocked, the next largest GPU die, or some combination.

  38. I have to say I’m a little bit disappointed in the DLSS+RTX performance hit, as it seems to be comparatively the same as Turing (about 10-20% depending on the game at 4K). I’m getting this figure by using 1440p RTX performance as the reference, since that is how it’s rendered under DLSS (the way I’m estimating it is sketched below).

    I was expecting much better performance, as Ampere Tensor cores are supposedly 3x faster and RT cores 2x faster than Turing’s. Some untapped potential, maybe?
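
    One way to express that estimate; this is only a sketch of the method, and the frame rates below are placeholders, not benchmark results:

      # DLSS "Quality" at 4K renders internally at roughly 1440p, so one rough way to
      # gauge the remaining DLSS+RT cost is to compare 4K+RT+DLSS frame rates against
      # native 1440p+RT frame rates. Placeholder numbers only.
      fps_1440p_rt_native = 80.0  # hypothetical
      fps_4k_rt_dlss = 68.0       # hypothetical
      overhead = 1 - fps_4k_rt_dlss / fps_1440p_rt_native
      print(f"Estimated DLSS+RT overhead vs. 1440p native: {overhead:.0%}")  # ~15% here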

  39. I have to say I’m a little bit disappointed in the DLSS+RTX performance hit, as it seems to be comparatively the same as Turing (about 10-20% depending on the game at 4K). I’m getting this figure by using 1440p RTX performance as the reference, since that is how it’s rendered under DLSS.

    I was expecting much better performance, as Ampere Tensor cores are supposedly 3x faster and RT cores 2x faster than Turing’s. Some untapped potential, maybe?

    I agree.
    But also pretty cool that $699 is legit 4K 60 FPS with just about every game out there.
    Hopefully they can tweak things a bit more down the line.

  40. I have to say I’m a little bit disappointed in the DLSS+RTX performance hit, as it seems to be comparatively the same as Turing (about 10-20% depending on the game at 4K). I’m getting this figure by using 1440p RTX performance as the reference, since that is how it’s rendered under DLSS.

    I was expecting much better performance, as Ampere Tensor cores are supposedly 3x faster and RT cores 2x faster than Turing’s. Some untapped potential, maybe?

    The full RT capability will not be seen in the older RTX titles since they used DXR 1.0. The Wolfenstein update, which showed a much better spread between the 2080 Ti and 3080, better represents the 3080’s RT potential; it is pretty surely using the much better parallelism of the DXR 1.1 enhancements.

    I put a decent amount of effort into obtaining this card through NVIDIA and Best Buy; I really wanted the FE. NVIDIA failed to deliver or sell me one. Been trying since without any luck. If NVIDIA cannot take care of their customers, then it’s best to move on.

  41. The full RT capability will not be seen in the older RTX titles since they used DXR 1.0. The Wolfenstein update, which showed a much better spread between the 2080 Ti and 3080, better represents the 3080’s RT potential; it is pretty surely using the much better parallelism of the DXR 1.1 enhancements.

    I put a decent amount of effort into obtaining this card through NVIDIA and Best Buy; I really wanted the FE. NVIDIA failed to deliver or sell me one. Been trying since without any luck. If NVIDIA cannot take care of their customers, then it’s best to move on.

    My fear, regardless of the manufacturer, is that we’ve gotten to a point where scripts rule over the consumer base. Anyone with enough capital could control the market as long as supply is limited at release. There’s no penalty for the bot world buying up anything and everything at launch, as long as there is demand for their resale.

  42. When should we expect a review of the 3090? I am actually contemplating waiting for the 20GB 3080 depending on how the 3090 performs. For the first time in history I feel like the 3080 is "enough" for my gaming resolution and needs, and that paying double for the 3090 is a waste of money :eek:.
  43. When should we expect a review of the 3090? I am actually contemplating waiting for the 20GB 3080 depending on how the 3090 performs. For the first time in history I feel like the 3080 is "enough" for my gaming resolution and needs, and that paying double for the 3090 is a waste of money :eek:.

    From what I can tell, whenever the NDA lifts it’ll probably be a smaller selection of reviewers than what was seen for the 3080. I’d also guess the lift will happen no later than the card on sale date of 9/24 @ 6AM PDT. At this point, we don’t have one nor do we have any confirmed in the pipeline. As with the 3080, I’ll be F5’ing to try to get one when they launch and we’ll continue to shake down manufacturers for one…

  44. When should we expect a review of the 3090? I am actually contemplating waiting for the 20GB 3080 depending on how the 3090 performs. For the first time in history I feel like the 3080 is "enough" for my gaming resolution and needs, and that paying double for the 3090 is a waste of money :eek:.

    I’m kind of feeling the same. Haven’t had this much ambivalence in a while. For me the real decision will be pricing. If it costs over $1,000 then I’ll still go for the 3090. Whether or not it’s GDDR6X could also be a factor.

  45. When should we expect a review of the 3090? I am actually contemplating waiting for the 20GB 3080 depending on how the 3090 performs. For the first time in history I feel like the 3080 is "enough" for my gaming resolution and needs, and that paying double for the 3090 is a waste of money :eek:.

    It is certainly a weird position to be in.

    While there are uses for the, uh, ‘excess’ performance, they don’t seem to merit significant increases in costs.

    Feels kind of like we’re on a divide, where more performance isn’t really useful for pure rasterization on desktops, but also isn’t nearly enough for, say, VR or RT (or both).

  46. It is certainly a weird position to be in.

    While there are uses for the, uh, ‘excess’ performance, they don’t seem to merit significant increases in costs.

    Feels kind of like we’re on a divide, where more performance isn’t really useful for pure rasterization on desktops, but also isn’t nearly enough for, say, VR or RT (or both).

    Only thing really holding me back from a 3080 right now is the 10GB memory, simply because I’ve seen the 11GB on my 2080 Ti maxed out in a few games at 4K resolution. And it wasn’t simply usage; in those cases I experienced degraded performance before turning down texture quality or other settings to reduce the VRAM needed. Most games fall into the 6-8GB range right now, but I’m just worried that more games will be coming down the pipe that start running into limitations with 10GB. I do understand that they probably could not hit their $700 target if they added more, though. I can see the 20GB version being $1,000 or close to it unless Micron will have 16Gb chips ready when it hits production.

  47. 10 is still more than 8. Remember, the 3080 isn’t an upgrade from the 2080 Ti; it’s an upgrade from the 2080/Super. If you have a 2080 Ti I’d recommend keeping it. The 3090 is really closer to the 2080 Ti replacement, but maybe there will be a middle card in the future, or there is of course the more expensive 20GB 3080 option.

    As for games utilizing more VRAM, well, I’m not sure what the trend will be. If DLSS is used more, that’s the answer to the VRAM capacity problem; it alleviates so much pressure on capacity when used.

    Games are also constantly developing new compression methods and ways to load balance everything correctly. With RTX I/O and Microsoft DirectStorage, decompression should be a lot better, and again the VRAM capacity issue won’t be such a problem.

    I know your concerns for sure, and it will really all depend on the games themselves. But I do implore you: if a new game supports DLSS, give it a try. I’m actually liking the technology now that I’ve used it, and DLSS 2.0 gives you good image quality and a perf increase.

  48. 10 is still more than 8. Remember, the 3080 isn’t an upgrade from the 2080 Ti; it’s an upgrade from the 2080/Super. If you have a 2080 Ti I’d recommend keeping it. The 3090 is really closer to the 2080 Ti replacement, but maybe there will be a middle card in the future, or there is of course the more expensive 20GB 3080 option.

    As for games utilizing more VRAM, well, I’m not sure what the trend will be. If DLSS is used more, that’s the answer to the VRAM capacity problem; it alleviates so much pressure on capacity when used.

    Games are also constantly developing new compression methods and ways to load balance everything correctly. With RTX I/O and Microsoft DirectStorage, decompression should be a lot better, and again the VRAM capacity issue won’t be such a problem.

    I know your concerns for sure, and it will really all depend on the games themselves. But I do implore you: if a new game supports DLSS, give it a try. I’m actually liking the technology now that I’ve used it, and DLSS 2.0 gives you good image quality and a perf increase.

    DLSS is great, I agree, but unfortunately I do not think it will become ubiquitous.

  49. These are just rumors, but rumors are AMD will have something similar to DLSS coming.

    If that can happen, and maybe some form of standard API can be achieved, then maybe it will be used more.

    Like Ray Tracing, someone had to get the ball rolling first.

  50. DLSS made enough of a difference to me personally that I simply would not buy a GPU without it.
    Control and Wolfenstein: YB alone were worth the price of admission to play in 4K with an RTX 2070.

    I was originally going to buy an R7. Glad I didn’t.
    RTX and DLSS were way more fun and useful to me than an extra 8GB of RAM could ever have been.

    Maybe I should just get a 3090 this time around and game on for the next 3 years; it seems very likely that 2080 Ti owners will get 3 years out of theirs.
    Something to be said about buying the best available stuff…

  51. These are just rumors, but rumors are AMD will have something similar to DLSS coming.

    If that can happen, and maybe some form of standard API can be achieved, then maybe it will be used more.

    Like Ray Tracing, someone had to get the ball rolling first.

    Is Contrast Adaptive Sharpening not AMD’s version of DLSS?

  52. Only thing really holding me back from a 3080 right now is the 10GB memory, simply because I’ve seen the 11GB on my 2080 Ti maxed out in a few games at 4K resolution. And it wasn’t simply usage; in those cases I experienced degraded performance before turning down texture quality or other settings to reduce the VRAM needed. Most games fall into the 6-8GB range right now, but I’m just worried that more games will be coming down the pipe that start running into limitations with 10GB.

    I do feel the same way; it’s not even that the 3080 has less than the 2080 Ti (though, as noted by others, the 2080 should be the point of comparison), but that memory didn’t increase much.

    I do understand that they probably could not hit their $700 target if they added more, though. I can see the 20GB version being $1,000 or close to it unless Micron will have 16Gb chips ready when it hits production.

    I kind of feel like it’s worth waiting. Part of that at least is coming from a 1080Ti and not really wanting to go backward in VRAM capacity particularly given how long I’m likely to keep the new card.

    As for games utilizing more VRAM, well, I’m not sure what the trend will be. If DLSS is used more, that’s the answer to the VRAM capacity problem; it alleviates so much pressure on capacity when used.

    Games are also constantly developing new compression methods and ways to load balance everything correctly. With RTX I/O and Microsoft DirectStorage, decompression should be a lot better, and again the VRAM capacity issue won’t be such a problem.

    As much as I admire upcoming solutions to the VRAM problem… these are ‘high-end’ solutions that require significant developer support. I can’t help but imagine that there might be games that slip through the cracks which wind up benefiting from the increased VRAM due to lack of optimization.

    That’s also compounded by waiting every other generation or so to upgrade in my case. More frequent upgraders probably have less to worry about!

  53. Is Contrast Adaptive Sharpening not AMD’s version of DLSS?

    No, that’s closer to NVIDIA’s Sharpening filter in the control panel

    https://nvidia.custhelp.com/app/ans…-image-sharpening-in-the-nvidia-control-panel

    DLSS uses AI (the Tensor cores) to take an image rendered at a lower resolution and scale it up toward a baseline, highly super-sampled reference image (on the order of 16x samples) processed on NVIDIA’s servers offline. It’s much more complex.

    This is why NVIDIA’s method provides faster performance: it’s rendering at a lower resolution and then uses hardware to basically upscale it toward a reference image with little cost in doing so (a conceptual sketch of the two approaches is below).

    AMD’s method still renders at the same resolution, and there is no AI upscaling. It doesn’t improve performance, only sharpens image quality when temporal antialiasing is used.

    Now, there is supposed to be a feature of CAS that can scale an image. However, I don’t know of an example of it, and you really don’t hear about performance increases when CAS is used. The video card is not using AI to upscale, cause there’s no method for that ATM. That’s what sets DLSS far apart from CAS; it’s much more functional.

    However, I need to read into CAS a bit more; I’m not 100% on how exactly it works, and I need to read a whitepaper or something. So far, it hasn’t been marketed as a feature to improve performance, only to improve image quality.

    It’s quite possible AMD could make a new version of CAS that is DLSS-like when they have the hardware to do so. Or they could brand their "DLSS" equivalent under a whole new feature name. Who knows, but the rumor is AMD will be coming out with something DLSS-like, and I’m not sure that’s CAS.
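
    A conceptual sketch of the difference being described; this is pseudocode only, and the function names are placeholders, not real APIs:

      # Placeholder stubs standing in for the real render/upscale/sharpen work.
      def render(scene, res):
          return f"{scene} rendered at {res[0]}x{res[1]}"

      def ai_upscale(image, res):
          return f"{image}, AI-reconstructed to {res[0]}x{res[1]}"

      def contrast_adaptive_sharpen(image):
          return f"{image}, sharpened in place"

      # DLSS-style path: render at a lower internal resolution, then reconstruct the
      # output resolution with a trained model; the rendering work itself gets cheaper.
      def dlss_style_frame(scene, output_res=(3840, 2160), scale=2 / 3):
          internal_res = (round(output_res[0] * scale), round(output_res[1] * scale))
          return ai_upscale(render(scene, internal_res), output_res)

      # CAS-style path: render at the output resolution, then sharpen; image quality
      # changes, but the rendering cost does not go down.
      def cas_style_frame(scene, output_res=(3840, 2160)):
          return contrast_adaptive_sharpen(render(scene, output_res))

      print(dlss_style_frame("Control"))
      print(cas_style_frame("Control"))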

  54. I thought FidelityFX was closer to DLSS than CAS?

    FidelityFX is a suite of technologies branded under the FidelityFX name; there are many individual features under that umbrella.

    There is:

    FidelityFX Contrast Adaptive Sharpening
    FidelityFX Screen Space Reflections
    FidelityFX Combined Adaptive Compute Ambient Occlusion
    FidelityFX Luminance Preserving Mapper
    FidelityFX Single Pass Downsampler

    and more

    So a game can have only one of these features and still be described as having FidelityFX technology, or it can have several of them.

    So the thing to look for in games is which of these specific FidelityFX features it is using; it could be just one, or several.
