Image: AMD

Peruvian hardware site XanxoGaming has returned with a new set of benchmarks that provide greater insight into how AMD’s first Ryzen CPU with 3D V-Cache technology might handle some of today’s most popular titles. Borderlands 3, Control, and Death Stranding are among the titles included in the tests, which were performed on systems equipped with a GeForce RTX 3080 Ti Founders Edition graphics card (full specs of both the AMD and Intel rigs can be found below). The benchmarks seem to suggest that the Ryzen 7 5800X3D generally lines up pretty closely with Intel’s standard Alder Lake-S flagship, the Core i9-12900K. AMD’s 8C/16T Ryzen 7 5800X3D with 3D V-Cache technology will be available globally beginning next Wednesday, April 20, at an MSRP of $449.

AMD Test System (For Gaming)

  • CPU: AMD Ryzen 7 5800X3D
  • Motherboard: X570 AORUS MASTER Rev 1.2 (BIOS F36c)
  • RAM: G.Skill FlareX 4x8GB 3200 MHz CL14 (Samsung B-Die)
  • Graphics Card: NVIDIA GeForce RTX 3080 Ti Founders Edition
  • SSD: Samsung 980 PRO 1TB
  • SSD #2: Silicon Power A55 2TB
  • AIO: Arctic Liquid Freezer II 360
  • PSU: EVGA SUPERNOVA 750W P2
  • OS: Windows 10 Home 21H2

Intel Test System (For Gaming)

  • CPU: Intel Core i9-12900KF (Power Limits Unlimited, no MCE)
  • Motherboard: TUF GAMING Z690-PLUS WIFI D4 (BIOS 1304)
  • RAM: G.Skill FlareX 4x8GB 3200 MHz CL14 (Samsung B-Die)
  • Graphics Card: NVIDIA GeForce RTX 3080 Ti Founders Edition
  • SSD: TeamGroup Delta MAX 250GB
  • SSD #2: Silicon Power A55 2TB
  • AIO: Lian Li Galahad 360
  • PSU: EVGA SUPERNOVA 750W P2
  • OS: Windows 10 Home 21H2 (Win Game Mode On, RSB On, HAGS Off). We prefer to test with Windows 10 for now.

Summary – Gaming Benchmarks – 1080p (XanxoGaming)

For the most part, 1080p results in our test suite are on par, with a few exceptions. In Death Stranding, even though the game has a 240 FPS cap, both AVG FPS and 1% LOW results are a tad better for the AMD Ryzen CPU.

Out of the full test suite, three results stand out, even at 1080p. The question here is which is the “fastest CPU,” and as AMD marketing showed, there is a real difference in FFXV, Shadow of the Tomb Raider, and The Witcher 3.

Out of those three, The Witcher 3 (Novigrad) stands out with a massive 22% advantage. Checking some past Alder Lake-S results I had with DDR5 (not included here, since I want to retest ADL-S with DDR5-6200 C40), there seems to be a bandwidth cap when using DDR4-3200 CL14 kits, or at least this title scales with faster kits on ADL-S.

We also ran a quick test with DDR4-3600 CL14 at 720p in The Witcher 3, which you will see later.

Another massive advantage comes in FFXV, with a 29.16% lead for the Ryzen 7 5800X3D.

To a lesser degree (but no small feat), Team Red has a 10.36% advantage in Shadow of the Tomb Raider.

It seems that at 1080p, most games will be ties between the Ryzen 7 5800X3D and the 12900K with DDR4, with some games showing a substantial win for the new 3D V-Cache technology. 12900K DDR5 results remain to be seen, but we do expect better performance (compared to ADL-S with DDR4) based on our prior results.
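For reference, the percentage advantages quoted throughout (22%, 29.16%, 10.36%) follow from pairs of average-FPS figures. A minimal sketch of that arithmetic, using made-up FPS values and an arbitrary 3% "tie" band (both are illustrative assumptions, not XanxoGaming's published numbers or methodology):

```python
def percent_advantage(fps_a: float, fps_b: float) -> float:
    """Return how much faster fps_a is than fps_b, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

def verdict(fps_a: float, fps_b: float, tie_band: float = 3.0) -> str:
    """Call the result a tie when the delta falls within +/- tie_band percent.

    The 3% default tie band is a hypothetical threshold, not one the
    article defines.
    """
    delta = percent_advantage(fps_a, fps_b)
    if abs(delta) <= tie_band:
        return "tie"
    return "A wins" if delta > 0 else "B wins"

if __name__ == "__main__":
    # Placeholder numbers: 129.16 vs. 100 AVG FPS yields the same kind of
    # 29.16% gap the article reports for FFXV.
    print(round(percent_advantage(129.16, 100.0), 2))  # 29.16
    print(verdict(129.16, 100.0))                      # A wins
    print(verdict(101.0, 100.0))                       # tie
```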

Gaming Benchmarks – 720p – AMD Ryzen 7 5800X3D (XanxoGaming)

To sum up, in our test suite, three games are tied using DDR4-3200 CL14 kits on both systems. Assassin’s Creed Origins shows a small 5.4% victory. The Death Stranding difference is small, but as at 1080p, and even with the game’s 240 FPS cap, AVG FPS and 1% LOW are a little higher.

The same phenomenon we described for The Witcher 3 at 1080p also occurs in FFXV (at both 1080p and 720p). Results for the Core i9-12900KF with DDR4-3200 CL14 do not improve by lowering the resolution in our custom scene, while they do for the AMD Ryzen 7 5800X3D.

What we presume is a bandwidth cap is very punishing for the 12900K with DDR4-3200 CL14 in some titles, and AMD’s solution (3D V-Cache) pulls ahead in those.

Some other games see a small increase with AMD’s 3D V-Cache, such as F1 2020.

Intel has a small 5% win in Strange Brigade (DX12 – Async On), which I believe comes down to Alder Lake-S IPC (which is considerably better, as seen in productivity benchmarks).

The AMD Ryzen 7 5800X3D also takes a 12.32% victory in Shadow of War at 720p.

Bonus: The Witcher 3 – Intel Core i9-12900K DDR4-3600 CL14 data (XanxoGaming)

To check whether there was any scalability in this title in particular, we installed four DIMMs of DDR4-3600 CL14 in our system. There was definitely an improvement over 3200 C14, so the next step is to benchmark the 12900K with DDR5-6200 C40.



31 comments

  1. Yup - though I'd expected deltas to be within say 5% either way, and some of those are so large that they bring into question the testing premise. Not due to concerns over validity of the results, but of their applicability to any real conclusion.

    The 5800X3D has significantly more cache than we're used to on desktop CPUs. Ryzen has been tuned for this, IMO, as memory support has been slow going for AMD and Zen, whereas Alder Lake really isn't well-tuned for anything. DDR4 reports have been lackluster when it comes to pushing memory speeds, and DDR5 is a mess with few being able to utilize the faster kits even in XMP.

    However, Alder Lake does perform better with faster memory, including faster DDR5, as demonstrated on boards specifically designed to push memory speeds - but like DDR5, these boards aren't priced at pedestrian levels, with many lacking some form of connectivity available on more mainstream boards in one way or another. Main point is, while Alder Lake isn't pushing DDR4 memory speeds to stratospheric levels, it can take advantage of the additional bandwidth available on DDR5 - even with higher access latencies.

    Then there's the stock limiter on Alder Lake - which I get, as the 5800X3D can't overclock, and further the power draw difference will start to get out of control. At the same time, Alder Lake will scale to the limits of its cooling, the same way PBO seems to work for Zen CPUs. With the main difference being that Alder Lake will easily eat up more than 300W with the brakes off, while an eight-core Zen CPU doesn't stand a chance hitting that level of power draw.

    Then there's Windows 10. I'll say that for the most part, there's very little end-user benefit here outside of specific circumstances, but that doesn't mean that there's no benefit. Specifically, Windows 11 includes Intel's scheduler for Alder Lake. If we were looking at GPU-limited scenarios this might not make any measurable difference at all, but when isolating CPUs, it very much could - and could help explain some of the more puzzling results.

    On balance, we know how Alder Lake performs on Windows 10 and Windows 11. AMD's 5800X3D puts on a good showing, and while discussing the limits of CPU performance is likely to hold the community's attention, the big win here, I think, is value for the AM4 platform. Anyone not already on an eight-core (or better) Zen 3 CPU will likely benefit substantially from upgrading to a 5800X3D, while not having to consider upgrading memory, motherboard, or cooling to get 99% of the performance available.

    Assuming that the $500 rumored price is true, that's a heck of a deal versus say an MSI Z690 Unify-X, a kit of DDR5 6400, and a 12900K (or KS!), as an example of what it might take to really wring the most performance out of Alder Lake - and you get to stay on Windows 10 a while longer!
  2. 1. Windows 11 runs just fine on Zen3. I did opt to go with the pro variant because I want the most control I can get over my operating system.

    2. The 5800x3d or whatever the full model is called is a great CPU for those that want a year or two more out of their AM4 platform.

    3. The generation after this one will be the real tell on whether AMD can seesaw with Intel on CPU performance, and whether they can start to properly seesaw with Nvidia on GPU performance. Not to mention their move into the big data center space with the new high-bandwidth I/O connectivity cards they're introducing to compete with Nvidia and Intel.

    All three companies need to execute very strongly. The ones with the most to lose are the victors. But the ones that need the win here are the ones perceived as being out of step.

    Intel and AMD should be happy that Nvidia did not buy ARM and enter the CPU race. AMD and Intel should be nervous and watching what Apple does with its M1 chips VERY closely. They are getting sized up to compete in the big data center environment. It will be interesting to see if Apple wants a piece of that pie again.

    The 5800x3D is a great CPU, I'm glad it competes where AMD is focused and does so well.

    The battle lines are drawn, and it is now time to look to the next generation of CPUs from Intel and AMD. (Maybe Apple.)

    It is ALSO time to look to where the next generation of video cards is going to fall performance/cost/wattage-wise. Nvidia, AMD, and Intel all have a stake in this game. (Again, maybe Apple.)

    To summarize.

    This is the last hurrah for AM4. It will give folks on a less-than-8-core current-generation AM4 CPU, or any previous-generation AM4 CPU, something to extend the life of their AM4-based gaming systems a while longer. It is a purely gaming-focused CPU, so it will be interesting to see how it performs in sell-through.
  3. Not sure why y'all are saying this is only good for those with fewer-than-8-core CPUs. This 5800X3D, from those charts, blows the doors off the 3700X/3800X, 2700X, and 1800X. Now if you're already on a 5800X, there's not much point in upgrading.

    This is easily going to be 40-ish% faster than my 3700X, and I'll get another 2-3 years of life out of my system.
  4. Not sure why y'all are saying this is only good for those with fewer-than-8-core CPUs. This 5800X3D, from those charts, blows the doors off the 3700X/3800X, 2700X, and 1800X. Now if you're already on a 5800X, there's not much point in upgrading.

    This is easily going to be 40-ish% faster than my 3700X, and I'll get another 2-3 years of life out of my system.
    You know what that is absolutely correct! I am revising my statement above.
  5. Yep, my mind is all but made up on getting one of these. I'll slap an AIO on it and call it a done deal for AM4 until I'm ready to upgrade. The article says MSRP is $449. Here's hoping I can find a sale or discount getting it closer to $400 or less before the holidays.
  6. I went and backtracked what I paid when I put together the X570 rig. This was in Dec. 2019.
    3700X $309
    X570 $199
    Ram $172
    PCIe 4.0 SSD $169

    I already had everything else on hand. I was just curious since I figure I'm looking upwards of $600+ to do this upgrade when I can.
  7. I went and backtracked what I paid when I put together the X570 rig. This was in Dec. 2019.
    3700X $309
    X570 $199
    Ram $172
    PCIe 4.0 SSD $169

    I already had everything else on hand. I was just curious since I figure I'm looking upwards of $600+ to do this upgrade when I can.
    Built mine in August 2019

    3700X $330
    X570 $300
    Ram $212
    1000W PSU $220
    2x 1TB m.2 $260
    Case $190
    Watercooling $1100

    A $450 upgrade at this point is worth it.
  8. All you guys upgrading 3000 series PCs. My 2700x is on the upgrade list - the only question is 5950 or one of these. Given it spends most of its time running distributed computing these days, the right choice is probably the 5950, but I’m going to check reviews.
  9. All you guys upgrading 3000 series PCs. My 2700x is on the upgrade list - the only question is 5950 or one of these. Given it spends most of its time running distributed computing these days, the right choice is probably the 5950, but I’m going to check reviews.
    Yeah, you're probably better off with a 5900X or 5950X. The former has had some decent sales since the holidays. My 3700X is mostly used for stuff I do around here, browsing, and gaming, so the 3D definitely makes more sense for me.

    edit: Just noticed your sig. I see you're already testing a 5950X so maybe the 3D for gaming could be the better choice. Like you said, check the reviews.
  10. When we start talking about 'better for gaming' at this level of performance - really have to question how much sense that approach makes.

    Think about it this way: Counter-Strike performance improvements are still relevant for some folks. Not because better average framerates are needed, but because response times can always be lowered more.

    People serious about that game would likely buy 500Hz monitors if they were available to upgrade from their 360Hz monitors.

    I can tell you that my 12700K at stock is effortlessly faster than my 9900K was for content creation tasks - I have to run serious stress tests to really make it (and the poor 360mm AIO cooling it) start to sweat.

    But for games? At 3840x1600 with decent settings?

    Even with a 3080 12GB, I'd be hard pressed to tell the difference outside of just how hot the 9900K would get.
  11. When we start talking about 'better for gaming' at this level of performance - really have to question how much sense that approach makes.

    Think about it this way: Counter-Strike performance improvements are still relevant for some folks. Not because better average framerates are needed, but because response times can always be lowered more.
    I agree with this

    When you are judging this based on some game run at potato resolution getting 342 FPS vs. 314 FPS, you're already at some ridiculous level that most people aren't even going to consider usable, both because the refresh rate is much higher than most mortal folks can use, and because most people aren't going to play a game at a resolution of 320x240.

    So the differences being discussed here are mostly academic. They exist, but much in the same way that a difference in top speed between a Lambo and a Ferrari exists: yeah, it's there, but both are so far above the legal speed limit on the public roads most people drive that the difference is, outside of some very specific circumstances, just academic.
  12. The difference should be noted at 1080p and 1440p, where the CPU is more involved. This is where most people game; myself, at 1440p. And the difference in FPS at 1440p between even my 3700X and a 5800X is substantial.

    Plus, this is a value upgrade to people's existing systems that adds longevity at a good price. I'm not ready to build a whole new system. If this gets me within a couple percent of the top-tier Intel, then it's worth every penny.
  13. The difference should be noted at 1080p and 1440p, where the CPU is more involved. This is where most people game; myself, at 1440p. And the difference in FPS at 1440p between even my 3700X and a 5800X is substantial.

    Plus, this is a value upgrade to people's existing systems that adds longevity at a good price. I'm not ready to build a whole new system. If this gets me within a couple percent of the top-tier Intel, then it's worth every penny.
    The value is in the eye of the beholder. Same reason I swapped motherboards when I upgraded to a 5900x.

    Now you could hit the issue a friend of mine is having. His new build, a 3090 with a 5950X CPU in a big case with lots of fans, is causing his breaker to trip. He is running a cheap 1050-watt gold-rated power supply. He wants to go to a platinum unit so he can be sure he's delivering power efficiently. Total overkill of a system for him, but it's all good.
  14. Nice. I went with the Corsair Elite Capellix 360 myself; too much other Corsair stuff installed. Would love to hear what your experience is like. For me, coming from a 3900X to a 5900X was nice, but I also went from a PCIe 3.0 to a PCIe 4.0 drive as well.
