Gaming Performance

Now we come to the testing we are sure you all can’t wait for; we were anxious to get to this part too and see how it all ends up for gaming performance. To test this properly and cover all the bases, we tested at 1080p, 1440p, and 4K to see from top to bottom how these CPUs affect game performance. For testing, we are using an NVIDIA GeForce RTX 3080 Ti Founders Edition video card. This ensures we are CPU bound at 1080p, while some games will still be GPU bound at 4K. Since we are testing all three resolutions, it’s a good mix to see where the CPU matters and where it doesn’t. Oh, and yes, we are testing the new Battlefield 2042 as well!

Battlefield 2042

Let’s start with the newest game on the block, Battlefield 2042. Since this is a multiplayer game, it is hard to test in a live environment with any consistency. Therefore, we chose to create a private Portal server and test with no players; it’s the only way we can ensure a consistent run-through. We created a custom team deathmatch and performed a manual run-through of the whole map.

Intel Core i7-12700K vs AMD Ryzen 7 5800X Battlefield 2042 performance graph

This graph may not show what you expected. We are actually seeing a performance difference between the CPUs at 1080p here, and slightly at 1440p. At 1080p, the Intel Core i7-12700K performs 9% faster than the AMD Ryzen 7 5800X, which is not an insignificant amount. There is a smaller 2% difference at 1440p in favor of the 12700K. Once we get to 4K in this game, we are completely GPU bound, even with the RTX 3080 Ti.

Cyberpunk 2077

Intel Core i7-12700K vs AMD Ryzen 7 5800X Cyberpunk 2077 performance graph

When you think of Cyberpunk 2077, you typically expect the game to be completely GPU dependent. However, at 1080p we do find a difference between these CPUs: the new Intel Core i7-12700K is once again faster, by 3% compared to the 5800X. It isn’t much, but with an even faster GPU this difference would likely grow. At 1440p and 4K we are GPU dependent, and there isn’t any real-world difference.

Far Cry 6

Intel Core i7-12700K vs AMD Ryzen 7 5800X Far Cry 6 performance graph

Far Cry 6 is another game where we did not expect such different results. The Intel Core i7-12700K is simply faster in this game at every resolution, even with the RTX 3080 Ti. At 1080p the 12700K is 22% faster than the Ryzen 7 5800X, a very major difference and nothing to scoff at; this is a real performance gap that can make or break high-refresh-rate gaming. Even at 1440p, the 12700K is 15% faster than the 5800X, another significant margin. The smallest difference is at 4K, but it is still 3%.
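For reference, the “X% faster” figures throughout come straight from the ratio of the two average frame rates. A minimal sketch, with made-up FPS values rather than the review’s actual data:

```python
# Hypothetical example of how an "X% faster" figure is derived from two
# average-FPS results. The FPS numbers below are illustrative only.

def pct_faster(fps_a: float, fps_b: float) -> float:
    """How much faster fps_a is than fps_b, as a percentage."""
    return (fps_a / fps_b - 1.0) * 100.0

# e.g. 122 FPS vs 100 FPS works out to a 22% advantage
print(round(pct_faster(122.0, 100.0)))
```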

Microsoft Flight Simulator 2020 Game Of The Year Edition

We have installed the latest huge patch to Microsoft Flight Sim 2020 which brings it up to the new Game of the Year Edition. This new edition has major performance changes, including a new DX12 API mode we are going to use for today’s testing.

Intel Core i7-12700K vs AMD Ryzen 7 5800X Microsoft Flight Simulator 2020 Game of the year Edition performance graph

We were surprised to find no differences in Flight Sim 2020, to be honest. We are running the game in the new DX12 API mode, and there appear to be no differences at any resolution between the CPUs. The API is new to this game and still in beta, so this could change either way with further optimizations.

Watch Dogs Legion

Intel Core i7-12700K vs AMD Ryzen 7 5800X Watch Dogs Legion performance graph

Watch Dogs Legion is another game where we are seeing a performance difference at 1080p and 1440p between the CPUs. At 1080p the new Intel Core i7-12700K is 20% faster than the AMD Ryzen 7 5800X. This again is a significant difference for a game. Even at 1440p, we find the 12700K to be 5% faster than the 5800X. Only at 4K are they much closer, but the 12700K is still 3% faster.

Crysis Remastered

Intel Core i7-12700K vs AMD Ryzen 7 5800X Crysis Remastered performance graph

In Crysis Remastered we find a small difference at 1080p and 1440p as well. At 1080p the 12700K is 3% faster, and at 1440p it is 6% faster.

Brent Justice

Brent Justice has been reviewing computer components for 20+ years. Educated in the art and method of the computer hardware review, he brings experience, knowledge, and hands-on testing with a gamer-oriented...

Join the Conversation

17 Comments

  1. If I were doing a fresh build today, I’d probably get a 12600K or 12700K and a DDR4 board. Since I’m already on AM4, I’ll be looking at the next Zen with the 3D cache.

    The 12900K makes no sense, but the mainstream parts look great.

  2. Good writeup. Not exactly shocked that Intel’s brand new part beats on the 2 year old 5800x though. There are price differences too… the 5800x has been on sale a couple times for $299, and compatible ddr4 mobos are cheap. Am4 Upgradeability is also nice. None of that can be said for Intel platform.

    We’ll see what Zen 4 looks like soon, but on the downside new ddr5 mobos are very likely.

    Even though it is outclassed now, I’m still thinking about picking up a 5800x if I can catch a deal… a last hurrah upgrade for my X470 board. I can probably milk that on my prod system for another couple years until prices and availability chill out on ddr5. The 2700x has been a workhorse for several years now.

    OTOH…. if the i7 could dramatically speed up Civilization, I’m curious. That game becomes so much of a cpu beatdown in the late stages. It’s also poorly threaded.

  3. The Intel chip is clearly NOT a direct competitor with the 5800x. This would only be the case if you wilfully ignore the cost of the parts.

For the same cost as the Intel chip, you can purchase a 5900x, which is therefore the direct competitor to this chip. Core count is not relevant, only cost.

    I’m willing to bet this chip does not perform nearly as well against the 5900x.

Also, to use DDR5 in comparison with DDR4 is just not good practice, again because of the prohibitive cost of DDR5 memory and boards. The AMD setup with DDR4 will be MUCH cheaper in comparison with little overall performance difference, and therefore AMD would come out on top in a realistic review…

  5. [QUOTE=”Gunsmoke, post: 45793, member: 5177″]
    The Intel chip is clearly NOT a direct competitor with the 5800x. This would only be the case if you wilfully ignore the cost of the parts.
    For the same cost as the Intel chip, you can purchase a 5900x, which is therefore the direct competitor to this chip. Core count is not relevant, only cost.
    I’m willing to bet this chip does not perform nearly as well against the 5900x.
    Also to use DDR5 in Comparison with DDR4 is just not good practice, again for the prohibitive cost of DDR5 memory and boards. The AMD setup with DDR4 will be MUCH cheaper in comparison with little overall performance difference and therefore AMD would come out on top in realistic review…
    [/QUOTE]

    Based on the cost numbers I’m seeing right now, the 5900X’s MSRP is $549 and is listed at NewEgg and BestBuy for that, $569 at B&H. The 5800X’s MSRP is $449, and is listed at that on BestBuy and B&H, but is ~$400 on NewEgg. The 12700K is listed at $404 on NewEgg, $419 on BestBuy and $380 (after a coupon) on B&H. If you’re looking at only the cost of the chip, I’d say we’re pretty spot on.

    With respect to the platform, we did find it did not make a material difference to performance in [URL=’https://www.thefpsreview.com/2021/12/13/intel-core-i5-12600k-alder-lake-ddr4-vs-ddr5-performance/’]a recent article comparing DDR4 to DDR5[/URL]. It’s an interesting debate on which platform to use. In theory, for a CPU review, one _should_ use the platform its manufacturer intends, showing the chip in its best light, and that’s with DDR5. The other factor here is that there’s a bit of a gulf in quality between the DDR4 and DDR5 boards out there: the DDR5 boards have far better power delivery circuitry, which can certainly impact the overall performance of a chip.

    Either way, over the next month or so, we’ll be working on covering all combinations of performance write ups, and that will certainly include a comparison to the 5900x for your reading enjoyment.

  6. I’d agree that testing Intel with DDR5 is appropriate: it supports the standard, and to do the testing with anything less might artificially be limiting the platform.

    In this case, it doesn’t look like it makes any difference, but I support the testing methodology. It would have a significant impact on total system cost, but I don’t think that is a point the article is trying to make right now.

  7. My previous system was a 5800x and I just upgraded to a 12700k a month or so ago.

    For my use cases there’s not much difference performance-wise, but I was having some stability issues recently with the 5800X build. In hindsight it may have been BIOS related, as all the components seem to be working in other machines after the fact, and I believe I’d done a BIOS update around the time I started getting blue screens and whatnot.

    I like the 12700k and have no regrets other than getting used to Windows 11. I’d recommend it to anybody.

    It’s not obvious which cores in task manager are the efficiency cores or what clock speeds the different cores are using. I’m sure there is another app that can tell you but I wish that was available in task manager.

    I suspect the best value will be a 12600K or 12400 and a B- or H-series motherboard when they come out, but I didn’t want to wait.

  8. The review was pretty much what I expected.

    12900k vs 5950x

    12700k vs 5900x

    12600k vs 5800x

    The 12700k vs 5800x has a core count advantage on the intel side.

  9. Interesting.

    I knew it would only be a matter of time before Intel would wrest back the performance crown.

    That said, they are doing so using significantly more power at load. Even without any architecture improvements, it would seem like AMD could narrow the gap by just increasing power use to where Intel is. They also have new improvements coming in the next gen, like that new high performing cache, which should be interesting.

    I’m truly hoping we get back to where things were from 1999 through the early 2000s, with the two leapfrogging each other with every release. That could make things fun again!

    For my purposes though, while Intel is pushing much higher numbers in most cases, they are practically insignificant for me. Being a “4K Ultra” kind of guy, I will spend all of my game time GPU limited, and for general desktop productivity type of stuff, even a dual-core Haswell Celeron is fast enough to not be a problem, so all this extra power is mostly wasted on me.

  10. [QUOTE=”Zarathustra, post: 45828, member: 203″]
    That said, they are doing so using significantly more power at load. Even without any architecture improvements, it would seem like AMD could narrow the gap by just increasing power use to where Intel is. They also have new improvements coming in the next gen, like that new high performing cache, which should be interesting.
    [/QUOTE]
    In most scenarios – basically everything except synthetics designed to push all cores – it looks like Intel is about the same as AMD on power use.

  11. [QUOTE=”Zarathustra, post: 45828, member: 203″]
    For my purposes though, while Intel is pushing much higher numbers in most cases, they are practically insignificant for me, as being a “4k Ultra” kind of guy, I will spend all of my game time GPU limited
    [/QUOTE]
    While this is generally a non-issue with Zen 3 [I]today[/I], higher 1080p results means that when GPUs get faster, Alder Lake’s performance advantage will become more significant. Same pattern we’ve seen decades over.

    Obviously this only matters for the games you or other prospective buyers would play 😎

  12. [QUOTE=”Zarathustra, post: 45828, member: 203″]
    For my purposes though, while Intel is pushing much higher numbers in most cases, they are practically insignificant for me, as being a “4k Ultra” kind of guy, I will spend all of my game time GPU limited,
    [/QUOTE]
    Except, this isn’t necessarily always going to be the case. I’ve done extensive testing on this and found that at 4K, CPU’s like the 2920X showed the same average FPS as a 9900K lending credence to the whole GPU thing, but playing games on the former wasn’t smooth. When I dug into it I found that in a lot of games, the minimum frame rates on certain CPU’s were horrendous and thus, not capable of delivering a good gaming experience regardless of the GPU.

    Here is an example from my [URL=’https://www.thefpsreview.com/2019/08/02/msi-meg-x570-godlike-motherboard-review/’]review[/URL] of the X570 GODLIKE motherboard. In fact, I wrote about this issue in detail [URL=’https://www.thefpsreview.com/2019/08/02/msi-meg-x570-godlike-motherboard-review/12/#cmtoc_anchor_id_0′]here.[/URL]

    [ATTACH type=”full” alt=”1642489389714.png”]1420[/ATTACH]

    The average frame rates would tell you that these perform the same, but if you look the minimum frame rates are almost half what they were on the Intel system. Now, the Ryzen 5000 series has stronger gaming performance and even the 3000 series issues with Destiny 2 were fixed, but initially the 3000 series required a work around of sorts to even run on those CPU’s. This is why it wasn’t tested during the initial CPU review.

    [ATTACH type=”full” alt=”1642489463342.png”]1421[/ATTACH]

    As you can see the averages are all pretty close to each other, but as you can see above, averages don’t tell the whole story. What I didn’t graph out were the minimums for all of the systems, simply opting to showcase the X570/3900X and the 9900K. However, the 2920X at stock speeds only achieved a frame rate of 26FPS minimum and 36FPS minimum when PBO was enabled. While the averages look good, I can assure you that it was not the best experience as I saw routine drops into the 45FPS range whenever things would happen on screen.

    Destiny 2 is admittedly a bit of an outlier, but there are plenty of others too. The moral of the story is that while you are primarily GPU limited at 4K, your CPU does indeed play a role and your minimum FPS and thus, your actual experience can be marred by a sub-par CPU.

  13. [QUOTE=”Dan_D, post: 47054, member: 6″]
    Except, this isn’t necessarily always going to be the case. I’ve done extensive testing on this and found that at 4K, CPU’s like the 2920X showed the same average FPS as a 9900K lending credence to the whole GPU thing, but playing games on the former wasn’t smooth. When I dug into it I found that in a lot of games, the minimum frame rates on certain CPU’s were horrendous and thus, not capable of delivering a good gaming experience regardless of the GPU.

    Here is an example from my [URL=’https://www.thefpsreview.com/2019/08/02/msi-meg-x570-godlike-motherboard-review/’]review[/URL] of the X570 GODLIKE motherboard. In fact, I wrote about this issue in detail [URL=’https://www.thefpsreview.com/2019/08/02/msi-meg-x570-godlike-motherboard-review/12/#cmtoc_anchor_id_0′]here.[/URL]

    [ATTACH=full]1420[/ATTACH]

    The average frame rates would tell you that these perform the same, but if you look the minimum frame rates are almost half what they were on the Intel system. Now, the Ryzen 5000 series has stronger gaming performance and even the 3000 series issues with Destiny 2 were fixed, but initially the 3000 series required a work around of sorts to even run on those CPU’s. This is why it wasn’t tested during the initial CPU review.

    [ATTACH=full]1421[/ATTACH]

    As you can see the averages are all pretty close to each other, but as you can see above, averages don’t tell the whole story. What I didn’t graph out were the minimums for all of the systems, simply opting to showcase the X570/3900X and the 9900K. However, the 2920X at stock speeds only achieved a frame rate of 26FPS minimum and 36FPS minimum when PBO was enabled. While the averages look good, I can assure you that it was not the best experience as I saw routine drops into the 45FPS range whenever things would happen on screen.

    Destiny 2 is admittedly a bit of an outlier, but there are plenty of others too. The moral of the story is that while you are primarily GPU limited at 4K, your CPU does indeed play a role and your minimum FPS and thus, your actual experience can be marred by a sub-par CPU.
    [/QUOTE]

    I’m with you, I usually judge my performance not by average framerate, but by minimum framerate, as that shows the worst case.

    I haven’t seen any bad 1% framerates yet, but who knows, you have to keep up with the times, and at some point I’ll need a new CPU again, but until then, I’m fine 🙂

    I’ve always been a “buy what you need NOW” kind of guy. It makes no sense to try to predict the future, which system will have the longest longevity, etc., because the early adopter penalty for something that has long staying power is usually super expensive, and you can never really predict the future.

    For right now, my system does the trick. If it doesn’t tomorrow, I’ll cross that bridge when I get to it.

    Future-proofing in tech is a fool’s errand.

  14. 1% lows at a very minimum, and 0.1% lows if you want to be pedantic, and that’s if you’re not going to go all out and start digging into frametimes.

    I’d much prefer frametime analysis myself…
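    To make the averages-vs-lows point concrete: here is a minimal sketch of deriving average FPS and 1%/0.1% lows from a capture of per-frame times (the kind of log that tools such as PresentMon or CapFrameX produce). Note that conventions for “lows” differ between tools; this version averages the slowest X% of frames, and the function name and sample data are our own:

    ```python
    # Sketch: compute average FPS and 1% / 0.1% lows from per-frame
    # times in milliseconds. This convention averages the slowest X%
    # of frames; other tools may use a percentile frametime instead.

    def fps_stats(frametimes_ms):
        """Return (average FPS, 1% low FPS, 0.1% low FPS)."""
        n = len(frametimes_ms)
        avg_fps = 1000.0 * n / sum(frametimes_ms)

        # Slowest (largest) frametimes first, then average the worst
        # slice to get the "low" figures.
        worst_first = sorted(frametimes_ms, reverse=True)

        def low(pct):
            count = max(1, int(n * pct / 100))
            return 1000.0 * count / sum(worst_first[:count])

        return avg_fps, low(1.0), low(0.1)
    ```

    A run that averages around 97 FPS can still show 1% lows of 25 FPS if a handful of frames take 40 ms each, which is exactly how an average can hide stutter.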

  15. Great review, and kudos to MSI for providing a good test sample. I would say it would definitely be an Intel build if building new at this time; the 12700K smashes the 5800X into oblivion in this review, and it is very nice to see Intel getting serious with a great comeback from where they were.

    As for Zen 4, it also comes with a new AMD platform. Just think of all past AMD platform changes and how smoothly (cough cough) or not they went. I don’t remember one major AMD platform change that went smoothly. Maybe TRX40 was their best, but it was a minor update. X370 and B350 had over a year of issues, bricked boards, etc. Intel platform changes, for the most part, go rather smoothly out of the gate. I would definitely wait and see with Zen 4 and AM5, and most likely go for attempt two with AMD if they are competitive or better overall. Right now I think Intel in general is the better choice for the desktop.

    AMD V-Cache, lol, one CPU incoming. That to me is not a serious attempt to maintain performance leadership, and with a frequency cut to boot. I would have expected 5900X/5950X V-Cache versions right out of the gate as well. It’s maybe more due to AMD being manufacturing-limited, stunting their growth other than their price growth. At current AMD CPU pricing, I would not recommend any of their desktop CPUs over Intel except maybe in specific use cases. Not only that, AMD has virtually abandoned their HEDT CPUs at Zen 2 unless you go Pro, and now you can get an over-a-year-old Zen 3 design.

    Anyway, the 12700K looks to kick AMD’s ass at this time, though whether that’s pertinent to someone is a case-by-case condition. For me, the 5800X and 3080 Ti are very much sufficient for the 4K big screen in the playroom, and probably for the next several years. The 3900X is good enough for the 6900XT and 4K. The Zen 2 3960X and 3090, I could use an upgrade there maybe.
