[Image: NVIDIA GeForce RTX 3070 Founders Edition video card on a black background]

Introduction

On September 16th, 2020, NVIDIA launched the new GeForce RTX 3080 Founders Edition video card, based on the next-gen Ampere architecture that powers the entire GeForce RTX 30 Series.  The RTX 3080 arrived alongside the announcement of two more cards in the lineup: the GeForce RTX 3090 and the GeForce RTX 3070.

The GeForce RTX 3080 Founders Edition launched at an MSRP of $699 and replaces the GeForce RTX 2080 and GeForce RTX 2080 SUPER, which also launched at $699 when they were released in 2018 and 2019, respectively. 

In our review, we found the GeForce RTX 3080 Founders Edition provided an over 50% performance bump over the GeForce RTX 2080 FE at 1440p, and an even bigger 70% advantage at 4K.  We also found the GeForce RTX 3080 outpaced the GeForce RTX 2080 Ti FE (NVIDIA’s previous flagship) by over 20% at 1440p and over 25% at 4K.  That means faster-than-GeForce RTX 2080 Ti FE performance, from a card that cost $1,199, for $699: the GeForce RTX 3080 Founders Edition saves you $500 and gives you more performance.
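To put that pricing in perspective, you can fold the 1440p uplift and the two MSRPs into a single performance-per-dollar figure.  A quick back-of-the-envelope sketch in Python, using only the numbers cited above (the 1.20 factor is the ~20% uplift at 1440p):

```python
# Performance-per-dollar of the RTX 3080 FE relative to the RTX 2080 Ti FE,
# using only figures cited in this review.
perf_ratio_1440p = 1.20   # RTX 3080 is ~20% faster than the 2080 Ti at 1440p
price_3080 = 699          # RTX 3080 FE MSRP (USD)
price_2080ti = 1199       # RTX 2080 Ti FE MSRP (USD)

value_ratio = perf_ratio_1440p * (price_2080ti / price_3080)
print(f"Perf per dollar vs. RTX 2080 Ti FE: {value_ratio:.2f}x")  # ~2.06x
```

In other words, on these figures the RTX 3080 FE delivers roughly twice the performance per dollar of the outgoing flagship at 1440p.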

Enter the RTX 3070

Now we come to NVIDIA’s next entrant in the GeForce RTX 30 Series lineup, the GeForce RTX 3070 Founders Edition video card.  This is the one most of you have been waiting to see.  At an MSRP of $499, it sits at a much more appealing, and reachable, price point for gamers.  Note that add-in-board partner custom video cards will be available on October 29th.

The question on everyone’s mind is whether this beast is going to provide faster-than-GeForce RTX 2080 Ti Founders Edition performance at a much lower price.  Prior to its launch, the rumor mill was churning, indicating that this video card would offer RTX 2080 Ti FE-like performance.  We will find out if that is true.

We are not going to dive into an architecture explanation in this review.  If you want to see what makes the new next-gen Ampere architecture tick, and what it offers, please read our GeForce RTX 3080 Founders Edition review.  We suggest doing so before reading this one, as it goes over the advantages in new RT Core performance, Tensor Core performance, and extra FP32 power in the CUDA Cores.  Those advancements are at the heart of the GeForce RTX 3070 as well.  In addition, there are new technologies to look at, like NVIDIA Reflex Low Latency Technology, RTX IO, and NVIDIA Broadcast.

GeForce RTX 3070 Founders Edition Specs

                         GeForce RTX 3070 FE    GeForce RTX 2070 FE    GeForce RTX 2070 SUPER FE    GeForce RTX 2080 Ti FE
Architecture / Process   Ampere, 8nm            Turing, 12nm FinFET    Turing, 12nm FinFET          Turing, 12nm FinFET
CUDA Cores               5,888                  2,304                  2,560                        4,352
Tensor Cores             184                    288                    320                          544
RT Cores                 46                     36                     40                           68
Texture Units            184                    144                    160                          272
ROPs                     96                     64                     64                           88
GPU Boost Clock          1725MHz                1710MHz                1770MHz                      1545MHz
Memory                   8GB 14GHz GDDR6        8GB 14GHz GDDR6        8GB 14GHz GDDR6              11GB 14GHz GDDR6
Memory Bandwidth         448GB/s                448GB/s                448GB/s                      616GB/s
TGP                      220W                   185W                   215W                         250W
MSRP                     $499                   $499 / $599 FE         $499                         $1,199

The GeForce RTX 3070 Founders Edition has an MSRP of $499.  It is based on the Ampere architecture and built on Samsung’s 8nm manufacturing process.  It has 46 SMs, 5,888 CUDA Cores, 184 Tensor Cores (3rd gen), 46 RT Cores (2nd gen), 184 Texture Units, and 96 ROPs, with a GPU Boost Clock of 1725MHz.  It carries 8GB of GDDR6 memory at 14GHz on a 256-bit bus, providing 448GB/s of memory bandwidth.  TGP is 220 Watts.
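As a quick sanity check on those memory numbers: peak bandwidth falls directly out of the effective data rate and bus width (the “14GHz” in the table is the effective GDDR6 data rate, i.e. 14Gbps per pin).  A minimal sketch of that standard calculation, with values taken from the spec table above:

```python
# Peak memory bandwidth (GB/s) = effective data rate (Gbps per pin) * bus width (bits) / 8
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# Bus widths follow from the bandwidth figures in the table.
for card, rate, bus in [
    ("RTX 3070 FE",       14, 256),
    ("RTX 2070 FE",       14, 256),
    ("RTX 2070 SUPER FE", 14, 256),
    ("RTX 2080 Ti FE",    14, 352),
]:
    print(f"{card}: {peak_bandwidth_gbs(rate, bus):.0f} GB/s")
# RTX 3070 FE: 448 GB/s ... RTX 2080 Ti FE: 616 GB/s
```

All four cards use the same 14Gbps GDDR6; the RTX 2080 Ti’s extra bandwidth comes entirely from its wider 352-bit bus.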

Price Comparison

When we look at what the price comparison from the last generation would be, it is actually the GeForce RTX 2070, but not the Founders Edition.  You see, when the GeForce RTX 2070 launched in 2018, NVIDIA actually had two variants of it.  There was the regular, or base, GeForce RTX 2070, which had an MSRP of $499 and a GPU Boost Clock of 1620MHz.  Then, NVIDIA sold its own variant, called the GeForce RTX 2070 Founders Edition, for $599.  The Founders Edition had a higher 1710MHz GPU Boost Clock, but that was the only difference. 

Therefore, the Founders Edition offered slightly higher performance at a $100 higher price.  However, with add-in-board manufacturers customizing cards, factory boost clocks rose to meet or surpass the Founders Edition boost clock anyway, and those custom cards started at the $499 base price and went up from there.  Then came the GeForce RTX 2070 SUPER in 2019, which cleaned up this mess and debuted at an MSRP of $499 as well, while being a much faster video card than the original RTX 2070 and its Founders Edition. 

The appropriate price comparison to the last generation, then, is the GeForce RTX 2070, in either base or Founders Edition form.  The GeForce RTX 2070 SUPER Founders Edition is also a comparison point, since it likewise debuted at $499 in 2019.  We therefore include both the GeForce RTX 2070 Founders Edition and the GeForce RTX 2070 SUPER Founders Edition in this review.

RTX 2070 and SUPER Comparison Specs

The GeForce RTX 2070 FE, by comparison, had 36 SMs, 2,304 CUDA Cores, 288 Tensor Cores (2nd gen), 36 RT Cores (1st gen), 144 Texture Units, and 64 ROPs, with a 1710MHz GPU Boost Clock (Founders Edition).  Interestingly, it also had 8GB of GDDR6 at 14GHz on a 256-bit bus offering 448GB/s.  The TGP was 185W, and it launched at $499 for the base model and $599 for the Founders Edition. 

The GeForce RTX 2070 SUPER FE, by comparison, had 40 SMs, 2,560 CUDA Cores, 320 Tensor Cores (2nd gen), 40 RT Cores (1st gen), 160 Texture Units, and 64 ROPs, with a 1770MHz GPU Boost Clock.  It also had 8GB of GDDR6 at 14GHz on a 256-bit bus offering 448GB/s of bandwidth.  The TGP was 215W, and it launched at $499.
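Raw core counts understate the generational change here, because Ampere doubles the FP32 datapath per SM (128 FP32 lanes per SM versus Turing’s 64, which is why 46 SMs yield 5,888 CUDA Cores).  Theoretical FP32 throughput is commonly estimated as 2 FLOPs (one fused multiply-add) per CUDA core per clock; a quick sketch using the core counts and boost clocks from the spec table:

```python
# Theoretical peak FP32 throughput: 2 FLOPs (one fused multiply-add) per core per clock.
def fp32_tflops(cuda_cores: int, boost_mhz: int) -> float:
    return 2 * cuda_cores * boost_mhz * 1e6 / 1e12

for card, cores, boost in [
    ("RTX 3070 FE",       5888, 1725),
    ("RTX 2070 FE",       2304, 1710),
    ("RTX 2070 SUPER FE", 2560, 1770),
    ("RTX 2080 Ti FE",    4352, 1545),
]:
    print(f"{card}: {fp32_tflops(cores, boost):.1f} TFLOPS")
# ~20.3, ~7.9, ~9.1, and ~13.4 TFLOPS, respectively
```

Keep in mind these are paper numbers: half of Ampere’s FP32 units share a datapath with INT32 work, so real-world gains are smaller than the TFLOPS gap suggests, as our GeForce RTX 3080 Founders Edition review explains.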

Brent Justice

Brent Justice has been reviewing computer components for 20+ years.  Educated in the art and method of the computer hardware review, he brings experience, knowledge, and hands-on testing with a gamer-oriented...

Join the Conversation

38 Comments

  1. A little disappointing after the hype. I was expecting it to be close to the 2080 Ti in rasterization, but be faster in ray tracing. I suspect that it is actually the lower memory bandwidth that is hurting the card here rather than memory capacity.
  2. Thank you Brent for the awesome review! This card checks off a lot of the boxes I’d want in a card, with one exception: VRAM. I will be making the move to 4K monitors this fall, and I don’t believe 8GB of VRAM is going to be enough to last for the long haul. While the FPS #’s shown for this card are most definitely "usable", I fear the longevity of the solution won’t be there.
  3. A little disappointing after the hype. I was expecting it to be close to the 2080 Ti in rasterization, but be faster in ray tracing. I suspect that it is actually the lower memory bandwidth that is hurting the card here rather than memory capacity.

    Maybe, but a nice solid upgrade for 2070 and 5700 users.

  4. I agree 100% with the conclusion. As a GTX 1070 Ti owner, this is the card I should be looking for… Except this could all change by tomorrow
  5. Wow…this is looking like the best current option for 1080p or 1440p with high refresh.

    Going to feel like a true competitive market again once the Radeon 6000 series drops. Well, depending on their price points.

    Thanks for the amazing review!

  6. Thank you Brent for the awesome review! This card checks off a lot of the boxes I’d want in a card, with one exception: VRAM. I will be making the move to 4K monitors this fall, and I don’t believe 8GB of VRAM is going to be enough to last for the long haul. While the FPS #’s shown for this card are most definitely "usable", I fear the longevity of the solution won’t be there.

    Given how close the 3070 seems to be to the RTX 2080 Ti, let me tell you: the card is NOT a 4K gaming card to begin with. I’ve used the RTX 2080 Ti extensively at 4K and it’s great for some titles, but many of them perform at frame rates I’d consider subpar for it. You sometimes end up having to turn settings down if you want to maintain 60FPS. At the very least, I often had to ditch anti-aliasing or drastically reduce it. While average frame rates are usually sufficient, minimums can fall into the 45FPS range or sometimes worse.

    In 8 of the games Brent tested at 4K, neither the RTX 2080 Ti nor the 3070 could achieve a 60FPS average. That would mean that the minimums would likely be in the toilet. I can tell you from extensive testing on the RTX 2080 Ti and RTX 2080 Super cards that they are. These cards can sometimes do OK at 4K, but generally, they are NOT 4K cards. You need an RTX 3080 if you want a good experience at 4K without having to turn games down to potato mode.

  7. Given how close the 3070 seems to be to the RTX 2080 Ti, let me tell you: the card is NOT a 4K gaming card to begin with. I’ve used the RTX 2080 Ti extensively at 4K and it’s great for some titles, but many of them perform at frame rates I’d consider subpar for it. You sometimes end up having to turn settings down if you want to maintain 60FPS. At the very least, I often had to ditch anti-aliasing or drastically reduce it. While average frame rates are usually sufficient, minimums can fall into the 45FPS range or sometimes worse.

    In 8 of the games Brent tested at 4K, neither the RTX 2080 Ti nor the 3070 could achieve a 60FPS average. That would mean that the minimums would likely be in the toilet. I can tell you from extensive testing on the RTX 2080 Ti and RTX 2080 Super cards that they are. These cards can sometimes do OK at 4K, but generally, they are NOT 4K cards. You need an RTX 3080 if you want a good experience at 4K without having to turn games down to potato mode.

    To think, the RTX 2080 Ti WAS THE 4K gaming card just last year. Heck, even the RTX 2080 Super was a 4K-capable card in plenty of games.

    On a side note, DLSS could be a game changer when going 4K, especially with its new modes. Time will tell…

  8. Given how close the 3070 seems to be to the RTX 2080 Ti, let me tell you: the card is NOT a 4K gaming card to begin with. I’ve used the RTX 2080 Ti extensively at 4K and it’s great for some titles, but many of them perform at frame rates I’d consider subpar for it. You sometimes end up having to turn settings down if you want to maintain 60FPS. At the very least, I often had to ditch anti-aliasing or drastically reduce it. While average frame rates are usually sufficient, minimums can fall into the 45FPS range or sometimes worse.

    In 8 of the games Brent tested at 4K, neither the RTX 2080 Ti nor the 3070 could achieve a 60FPS average. That would mean that the minimums would likely be in the toilet. I can tell you from extensive testing on the RTX 2080 Ti and RTX 2080 Super cards that they are. These cards can sometimes do OK at 4K, but generally, they are NOT 4K cards. You need an RTX 3080 if you want a good experience at 4K without having to turn games down to potato mode.

    I’ve only had a few games drop below 50 FPS with the 2080 Ti. Funnily enough, I was playing through Dying Light again to experience all the DLC I missed and it was one of the games that did this, so I had to drop the draw distance. That was the only thing I changed. Average framerate was still in the 70s prior to dropping the draw distance. The 3070 is solidly a 1440p card, but it is a fine entry-level 4K card going by the review and based on relative 2080 Ti performance.

  9. To think, the RTX 2080 Ti WAS THE 4K gaming card just last year. Heck, even the RTX 2080 Super was a 4K-capable card in plenty of games.

    On a side note, DLSS could be a game changer when going 4K, especially with its new modes. Time will tell…

    No, the RTX 2080 Ti was the fastest gaming card on the market last year. It was "the 4K gaming card" by default as there was nothing better. That’s the only reason it was "the best" at 4K. That didn’t mean it was good at 4K. The data speaks for itself on that. It’s really not. 3840×2160 is still too demanding for most graphics cards. The RTX 3080 is really the first graphics card that has enough performance to provide a solid gaming experience at that resolution.

    I used the RTX 2080 Ti at 4K for a year. The experience was lacking in several games. I also have an RTX 2080 Super on the test bench. An overclocked one at that. I do a lot of 4K testing with it. It’s certainly capable of 4K gaming in some titles, but it leaves even more to be desired than the 2080 Ti does. The data is in the review. The RTX 2080 Ti and 3070 failed to achieve a 60FPS minimum in 8 out of 10 games. Many were 50FPS, but one was as low as 33FPS. Some were in the 40FPS range. This isn’t up to par by most standards unless you have a slideshow fetish.

    Hitman 2 is a good example of what I’m talking about. In this game, the RTX 2080 Super can’t hit an average of 60FPS on any of the test configurations. It’s close, though minimums are basically awful.

    [Attachments 592 and 591: 4K benchmark graphs]

    Ghost Recon Breakpoint is another great example of what I’m talking about. This game is entirely GPU-limited. The RTX 2080 Super cannot achieve a 60FPS average frame rate while maxing the game out. The settings used here are "VERY HIGH", not "Ultra", and **** sure not "Ultimate". I played this game on my RTX 2080 Ti on Ultimate settings, and it can average 60FPS, as was Brent’s conclusion. However, the minimums do drop into the 45FPS range enough that you notice it, at least on DirectX 12. Vulkan rendering wasn’t available when I played the game at 4K on my RTX 2080 Ti. Keep in mind the graph above references the RTX 2080 Super at "Very High" using Vulkan. In DirectX 12 mode, things are even worse, as there is roughly a 20% drop in performance vs. Vulkan.

    Again, the RTX 2080 Super and RTX 2080 Ti (and by extension, the 3070) are not good cards for 4K gaming. The data is clear on this point. That said, they can do a decent job of it in certain specific situations. Shadow of the Tomb Raider actually performs quite well on the RTX 2080 Super at 4K. However, it only does so without ray tracing enabled. Obviously, the RTX 2080 Ti would be fast enough in that situation. I haven’t tried ray tracing at 4K, but at 3440×1440, it was surprisingly good. But that only proves the point: in some titles, it’s fine. In others, not so much.

    When you start using things like ray tracing, that performance gets even worse. If you are the kind of gamer that doesn’t mind turning settings down to get better frame rates, and you can live with that, go on and get a 3070. But, as someone with a 4K display who has used the RTX 2080 Super and RTX 2080 Ti extensively at 4K, I’m telling you (and so is the data) that those cards are NOT suited to it. In newer titles they struggle. That’s not going to get better over time. The 3070 has less VRAM and less memory bandwidth than the RTX 2080 Ti does as well. That doesn’t bode well for the long-term aging of the card at 4K.

    The thing is, most people that shoot for 4K gaming tend to be into the fidelity of it. They usually aren’t the crowd that wants to turn settings down. As a result, the 3070 isn’t really, or shouldn’t be, their target graphics card. Let’s also not kid ourselves: 60Hz sucks ***. What you really want is at least 3840×2160@120Hz. Nothing short of an RTX 3080 or RTX 3090 is remotely capable of achieving that. 4K is very much a "pay to play" resolution. Just because last year’s card was the best you could do doesn’t mean it was great at the task.

    To be clear, I think the RTX 3070 is fantastic. It delivers almost identical performance to the RTX 2080 Ti at less than half the price. That’s amazing. However, that does not make it ideal for 4K. Anytime you have to start turning settings down at a given resolution in several games, it’s clear that card isn’t well suited to playing those games at that resolution. Again, 8 out of 10 games in Brent’s review couldn’t hit a 60FPS average using an RTX 2080 Ti or RTX 3070.

  10. I think the review was nice.
    I don’t like the mindset that says this card is "high-end" or "mid-range" or "low-end".
    This is not a high-end card. It cannot push 4K with all the top options ticked.
    Look, it barely runs Metro Exodus with RT enabled.
    Much more RT-intense games are coming.
    This card was judged at 1440, which is not where the "high-end" lives anymore.
    I use 1440 all the time because I can’t see plunking down money for 4K……yet, and I don’t game that much anymore. But when I do, I want 1440 with all the buttons pushed. I think the 3080 will give me that AND RT, if there is ever one to buy.
    If you are gaming at 1080 and want all the stuff checked, this is your card, and at a great price.
    I am a little disappointed; I thought this was going to be the 1440 card for everything plus RT, but it looks like a bust in that regard.
  11. To think, the RTX 2080 Ti WAS THE 4K gaming card just last year. Heck, even the RTX 2080 Super was a 4K-capable card in plenty of games.

    On a side note, DLSS could be a game changer when going 4K, especially with its new modes. Time will tell…

    In my mind, "4K gaming" is an ambiguous term that’s being tossed around too much. You don’t need a 3080 to play Diablo 3 in 4K, but obviously the same cannot be said if you want to play Control. "4K gaming" really only applies to a small number of the latest triple-A games, or some poorly coded crap that I’m sure is out there.

    Myself, I play a lot of older games or small indie titles that rival solitaire with their graphics… and I play them all in 4K with max settings without needing a 3080… or even a 2080 Ti for that matter. YMMV.

  12. A little disappointing after the hype. I was expecting it to be close to the 2080 Ti in rasterization, but be faster in ray tracing. I suspect that it is actually the lower memory bandwidth that is hurting the card here rather than memory capacity.

    Looking at OC reviews, increased bandwidth doesn’t increase performance much, if at all.

    Actually, I’m really disappointed with the OC figures. I expected much better results, since the card pulls 100W less than the RTX 3080.

  13. In my mind, "4K gaming" is an ambiguous term that’s being tossed around too much. You don’t need a 3080 to play Diablo 3 in 4K, but obviously the same cannot be said if you want to play Control. "4K gaming" really only applies to a small number of the latest triple-A games, or some poorly coded crap that I’m sure is out there.

    Myself, I play a lot of older games or small indie titles that rival solitaire with their graphics… and I play them all in 4K with max settings without needing a 3080… or even a 2080 Ti for that matter. YMMV.

    Amen

    A lot of games out there that will run great at 4K with the 3070.
    Not everyone plays AAA titles only, just like not everybody cares for RT.
    I think the 3070 is gonna make a nice upgrade for a lot of ppl :)

    I logged 600+ hrs in NMS in 4K with a 2070, and played the **** outta a bunch of WH40K titles, etc.
    Not to mention a slew of DLSS enabled stuff.

    I’ll probably go 3080 or 3090 at this stage, but I could easily keep going with a 3070 at 4K60 at everything I play atm.
    But I’m curious about 4K144, so that changes things for me.

  14. Awesome review Brent!!!!!!!!!

    As others have said, I’m not all that hyped about the card other than the fact that it pretty much exchanges blows with the 2080Ti which is pretty impressive.
    One can certainly expect to see prices of new and used 2080 Tis drop drastically in the coming months!

  15. People complain that the 3070 only has 8GB vs. the 2080 Ti’s 11GB, but the performance difference even at 4K doesn’t seem to reflect this.

    If the RTX 3070 were memory-starved, the performance hit would be much bigger.

    I recall when the Fury X came out, people complained about it having only 4GB of RAM, but it really didn’t hurt performance until a few years later.
    In contrast, the Vega VII never really took advantage of its 16GB of RAM until FS2020 came out at 4K, and even then it’s only slightly faster than the GTX 1080 Ti.

  16. Great review. Regarding how to tier cards, shouldn’t it be by volume? i.e., the bulk of cards sold at $xxx.xx would define one tier, and based on that you define the rest. Then you have tiering based on use: blender/creative work/AI with large RAM requirements, etc.
  17. I’ve always roughly grouped GPUs by price as well. I don’t know if low/mid/high really matters, but generally <$150, $150-200, $180-250, $250-350, $350-500, and $500+ are more or less the buckets I’ve always kinda seen
  18. Times were much simpler a few years ago. You had the top end card and a cut down version, then a couple of midrange cards, a mainstream card and a value card.

    Now there are too many choices.

    Once we went over $500 for a video card, things went downhill

  19. Times were much simpler a few years ago. You had the top end card and a cut down version, then a couple of midrange cards, a mainstream card and a value card.

    Now there are too many choices.

    Once we went over $500 for a video card, things went downhill

    How else would a bubble build?

    I’ve always roughly grouped GPUs by price as well. I don’t know if low/mid/high really matters, but generally <$150, $150-200, $180-250, $250-350, $350-500, and $500+ are more or less the buckets I’ve always kinda seen

    That’s pretty much how I see it, with the 3070 hitting the upper end of upper mid-range.

  21. I managed to order one of these (somehow) from the Microcenter web store earlier this morning, well after Best Buy and Newegg were out of stock. I am guessing my order will get cancelled. There is no way Microcenter got that much FE inventory.

    I was literally one minute late signing in to my work computer, and I missed out on the Best Buy inventory. Figures.

  22. I "managed" to buy an MSI one, only to find out I was the 4th to order one and they only got 2. Expected delivery time is now approximately 1 month (it could be faster, but they are very cautious with estimated dates), so if that is improved stock, I wonder how many 3080s they got.

    I’ve seen a shop that is expecting PNY 3070s, but they are listed at 999€; that’s nuts.

    I also almost went for a Gigabyte 3080 that was in stock for a while, but at 1099€ that was too much. They also had a 3090 of the same model listed at 1999€ (and it has been sold in the meantime too).

  23. I find it really odd that of all the reviews I saw, NONE of them recommended waiting for AMD. Many of them actually recommended just going to buy the RTX 3070 right away, giving it awards and such.

    For example, Linus said in his RTX 3070 review to go and buy one before they went out of stock, BEFORE even starting the review.
    Fast forward to yesterday, and now he recommends waiting for AMD’s card release to make up your mind.

    So now everyone is recommending to just wait for the 6800/6800 XT.

  24. I find it really odd that of all the reviews I saw, NONE of them recommended waiting for AMD. Many of them actually recommended just going to buy the RTX 3070 right away, giving it awards and such.

    For example, Linus said in his RTX 3070 review to go and buy one before they went out of stock, BEFORE even starting the review.
    Fast forward to yesterday, and now he recommends waiting for AMD’s card release to make up your mind.

    So now everyone is recommending to just wait for the 6800/6800 XT.

    Those videos may have been made for the old launch and delayed, and they could not be assed to edit them? I don’t know, I haven’t watched any yet.

  25. Those videos may have been made for the old launch and delayed, and they could not be assed to edit them? I don’t know, I haven’t watched any yet.

    Could be, but AFAIK reviewers get their cards just a few days before launch, and by then pretty much everyone knew about AMD’s potential performance, so a wait-and-see approach could at least have been suggested.

  26. As a reviewer, you really can’t opine on something that you don’t know anything about. At the time of the review embargo lifting, the RX series was nothing but a promised presentation, with no press guidance given as to expectations. You guys would be looking at him like he had three eyes and a cone-shaped head if the 3070 FE conclusion was to wait for AMD to announce something that may not even be price/performance competitive with it.

    That being said, the new information (which has not been independently validated) provided by AMD the next day makes it look like something that could be worth waiting for to compare to the 3070 FE. Therefore, it makes sense to change the opinion on waiting, though, with the current market availability, it’s not like it’s going to be a choice for most everyone.

  27. As a reviewer, you really can’t opine on something that you don’t know anything about. At the time of the review embargo lifting, the RX series was nothing but a promised presentation, with no press guidance given as to expectations. You guys would be looking at him like he had three eyes and a cone-shaped head if the 3070 FE conclusion was to wait for AMD to announce something that may not even be price/performance competitive with it.

    That being said, the new information (which has not been independently validated) provided by AMD the next day makes it look like something that could be worth waiting for to compare to the 3070 FE. Therefore, it makes sense to change the opinion on waiting, though, with the current market availability, it’s not like it’s going to be a choice for most everyone.

    Thing for me was that with so many sites recommending the RTX 3070, it sounded like they knew something we didn’t, like AMD wouldn’t have something to counter it. It ended up that they didn’t know what AMD had in its bag of tricks.

    Most of these sites are reputable ones, so I expected them to know better.

    Anyway, I don’t know what I’m complaining about, since it’s been a really long time since we had real competition. In the end, gamers win.

  28. Thing for me was that with so many sites recommending the RTX 3070, it sounded like they knew something we didn’t, like AMD wouldn’t have something to counter it. It ended up that they didn’t know what AMD had in its bag of tricks.

    Most of these sites are reputable ones, so I expected them to know better.

    Anyway, I don’t know what I’m complaining about, since it’s been a really long time since we had real competition. In the end, gamers win.

    While we have certainly published leaks in the news section over the past month that have been all over the board with what RX 6000 may or may not be, we found out what AMD had to offer at the same time as everyone else, during that live stream. Sure, we can all read the rumors that AMD might have something interesting to share, but at the time the 3070 reviews were published, that’s all there was.

    I am looking forward to the competition though.

  29. I think part of the problem is that with ~some~ reviewers… everyone gets an Award! So you can’t really tell what that means anymore. Winning some random award is just a subjective opinion with a special graphic tagged at the end of the review.

    It’s not like the hardware all has a deathmatch, all may enter, one may leave… On some sites, nearly every single GPU or CPU reviewed gets a Silver or Gold star, so long as the company is providing the review samples.

    There should be some objective qualifications for an award. I don’t know what… maybe every category is something different… but something, so that the awards at least have some objective meaning… And then if every item wins an award, you know it’s because they are at least meeting some minimum bar.

    I always liked you guys’ PSU reviews, because winning an award there meant it passed a pretty grueling torture test. Even if it didn’t win a Gold, it still meant it was a good PSU. I wish we could bring something like that to GPUs and CPUs, but honestly I don’t know what the metrics would be that would keep it from being entirely subjective.

  30. I think part of the problem is that with ~some~ reviewers… everyone gets an Award! So you can’t really tell what that means anymore. Winning some random award is just a subjective opinion with a special graphic tagged at the end of the review.

    It’s not like the hardware all has a deathmatch, all may enter, one may leave… On some sites, nearly every single GPU or CPU reviewed gets a Silver or Gold star, so long as the company is providing the review samples.

    There should be some objective qualifications for an award. I don’t know what… maybe every category is something different… but something, so that the awards at least have some objective meaning… And then if every item wins an award, you know it’s because they are at least meeting some minimum bar.

    I always liked you guys’ PSU reviews, because winning an award there meant it passed a pretty grueling torture test. Even if it didn’t win a Gold, it still meant it was a good PSU. I wish we could bring something like that to GPUs and CPUs, but honestly I don’t know what the metrics would be that would keep it from being entirely subjective.

    I agree with you completely when it comes to some sites and their award tendencies. I know HardOCP used to be VERY strict with handing out awards at the end of reviews, and seeing as this site has most of the old staff now, I believe the same tough standards are being met today.

    I have my eye on what I’d like in a card, and I’ll wait for this place to validate my thoughts. All I can do is wait for AMD’s release and the inevitable proper testing.

  31. I "managed" to buy an MSI one, only to find out I was the 4th to order one and they only got 2. Expected delivery time is now approximately 1 month (it could be faster, but they are very cautious with estimated dates), so if that is improved stock, I wonder how many 3080s they got.

    I’ve seen a shop that is expecting PNY 3070s, but they are listed at 999€; that’s nuts.

    I also almost went for a Gigabyte 3080 that was in stock for a while, but at 1099€ that was too much. They also had a 3090 of the same model listed at 1999€ (and it has been sold in the meantime too).

    So a little update: I got a call from the store today saying they’ve been in contact with MSI but have no idea when more will come in. They did, however, get a couple of Zotac 3070s in that were available, so I gave the OK to send me one of those instead (they also let me know that if I wanted to stay with my original order, I might not get my card this year).

    Side note: while talking to the employee, I mentioned I would have originally preferred an ASUS TUF, after which I was told they already have 400+ sold as preorders. There might be some cancellations there once AMD launches their cards.

  32. I always liked you guys’ PSU reviews, because winning an award there meant it passed a pretty grueling torture test. Even if it didn’t win a Gold, it still meant it was a good PSU. I wish we could bring something like that to GPUs and CPUs, but honestly I don’t know what the metrics would be that would keep it from being entirely subjective.

    I think that the problem stems from a comprehensive rating not really being a linear, unidimensional thing, but on the other hand, that’s exactly what people want to see, from casual buyers to marketers.

    Like, is it a good product? Is it a good product for specific common uses? Uncommon uses? Is it a good product compared to other products in its class? Is it a good value?

    A power supply …not exploding… that’s given a ‘Pass!’ rating is fairly straightforward, of course. If it’s somewhat above average, maybe that’s a Bronze, and then other ‘medals’ follow with a better showing.

    Now do that with motherboards :).

  33. I agree with you completely when it comes to some sites and their award tendencies. I know HardOCP used to be VERY strict with handing out awards at the end of reviews, and seeing as this site has most of the old staff now, I believe the same tough standards are being met today.

    I have my eye on what I’d like in a card, and I’ll wait for this place to validate my thoughts. All I can do is wait for AMD’s release and the inevitable proper testing.

    I can’t speak for everyone, but for me and the GPU reviews I did at [H], the award was subjective; I was free to give one as I desired, and I’m using the same internal criteria for awards here that I used there.

  34. I can’t speak for everyone, but for me and the GPU reviews I did at [H], the award was subjective; I was free to give one as I desired, and I’m using the same internal criteria for awards here that I used there.

    Yeah, that’s why I mentioned the PSU awards – they had very objective measures.

    That being said, I don’t really know what measures you would have for GPUs, so this isn’t like a scathing rebuke of your methods or anything. And with GPUs, you can have several different SKUs of the same GPU… so how do you differentiate an award given to the GPU architecture versus one given to the AIB for a good implementation of a particular SKU?

    So I don’t come bearing any solutions better than what you’re already doing, other than to point out that the awards are pretty obviously subjective. But fortunately, you have earned my trust in judgement on them… whereas I can’t say that about other sites. I’d just feel warm and fuzzy if there were something more objective, but I don’t have a good suggestion for how exactly that should look.
