Image: DICE

Early benchmarks show Battlefield 2042’s open beta requiring up to 16 GB of VRAM at 8K. The tests used a GeForce RTX 3090 and multiple Radeon cards. At 8K with the Medium preset, VRAM usage reached 13 GB, and the GeForce RTX 3090 averaged around 30 FPS. The Ultra preset added roughly another 2 GB to the frame buffer, pushing the AMD cards to their 16 GB limit; at that point, none of the cards exceeded 28 FPS.

The final game will include DLSS, which could substantially improve the framerate on the GeForce RTX 3090, possibly bringing it closer to 60 FPS at the Medium preset. The GeForce RTX 3080 Ti was not tested, but similar results can be expected given how closely it compares to the GeForce RTX 3090. There has been no word on whether FSR will be added to the game, which would also improve performance at 8K.

Image: TweakTown

Other games have also shown high VRAM usage at 8K. Even when a graphics card has enough VRAM, framerates remain far from what most would consider acceptable. Though 8K gaming is not mainstream, it points to a trend of new games needing more memory, and graphics card manufacturers are equipping their cards with more and more of it.

[…] you will truly need 16GB of VRAM at least to run 8K without DLSS enabled. The 24GB of ultra-fast GDDR6X memory on the GeForce RTX 3090 is fine, as too is the 16GB of GDDR6 memory on the AMD Radeon RX 6800, Radeon RX 6800 XT, and Radeon RX 6900 XT graphics cards.

Source: TweakTown


Peter Brosdahl


Join the Conversation

9 Comments

  1. Yea I’m still happy with 1440p for my computer and looking at 4k for the living room tv. 8k gaming is useless to me.

    Plus when the HECK will people understand that DLSS means you are NOT GAMING AT 4K or 8K. You might as well advertise and review the performance at the ACTUAL RESOLUTION THAT YOU ARE GAMING AT.

    I know that’s awful yelly and I apologize, it’s just better upscaling.

    woooooooooooohooooooooooo…. <-sarcastically impressed
    I mean am I wrong here?

  2. [QUOTE=”Burticus, post: 42387, member: 297″]
    8K here we come! Said no one ever.
    [/QUOTE]

    Yeah, 8K rendering isn’t [I]completely[/I] useless, but its uses sure are pretty limited.

    Estimates of the human eye’s resolution range from approximately 0.39 to 0.59 arc minutes per line pair, depending on the study. At 8K you are going to outresolve the human eye unless you sit really close, but if you sit that close, the whole screen won’t fit in your field of view, so you’ll only be looking at a small section of it.

    So with 8k:

    Have a screen size and distance combination that fills your field of view = no improvement in perception of quality over 4k.

    Have a screen size and distance combination where large portions of the screen fall OUTSIDE your field of view = may see an improvement, but you’ll be losing lots of screen real-estate, so what is the point?

    I’m sure 8k will hit at some point, and people will convince themselves that they just HAVE to have it even though there is no perceptible benefit, because that’s how these things seem to go. But in real terms, it’s a waste of money, electricity, you name it.

    I see two potential benefits to 8k:

    1.) Large screens where people only use a small portion at a time, like in a conference room where you may be working on one section of it, not looking at the rest.

    2.) As a form of anti-aliasing. We know DSR looks good; there is no better anti-aliasing. It’s pretty damn computationally expensive though, and there are probably better ways to do it.
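[Editor’s note: the acuity argument above can be checked with a quick back-of-the-envelope calculation. This is our own sketch, not from the commenter: it uses the mid-range estimate of about 0.5 arc minutes per line pair, and the 65-inch screen size is an assumed example.]

```python
import math

def min_resolvable_distance(diag_in, h_pixels, arcmin_per_pair=0.5,
                            aspect=(16, 9)):
    """Viewing distance (inches) beyond which one line pair (two pixel
    columns) subtends less than the eye's resolving angle."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)       # panel width from diagonal
    line_pair_in = 2 * width_in / h_pixels          # width of two pixels
    angle_rad = math.radians(arcmin_per_pair / 60)  # acuity limit in radians
    return line_pair_in / angle_rad

# 65-inch panels, mid-range acuity estimate of 0.5 arcmin per line pair
print(round(min_resolvable_distance(65, 7680)))  # 8K: ~101 in (~8.5 ft)
print(round(min_resolvable_distance(65, 3840)))  # 4K: ~203 in (~17 ft)
```

Under these assumptions, a 65-inch 8K panel is past the acuity limit beyond roughly 8.5 feet, while the same panel at 4K holds up out to about 17 feet, which is consistent with the field-of-view argument above.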

  3. [QUOTE=”Zarathustra, post: 42365, member: 203″]
    Ok.

    But why would you play it at 8k?
    [/QUOTE]
    Here you go: [ATTACH type=”full”]1275[/ATTACH]

  4. [QUOTE=”Burticus, post: 42387, member: 297″]
    8K here we come! Said no one ever.
    [/QUOTE]
    It’s actually the first resolution jump in over a decade that I have no interest in. 1080p-sure, 1440p-I was all about it, 4K-I only just managed to get all my equipment synced up for it and have no desire for anything else right now. In ten years, maybe, but by then I might be retired and no longer care anything about upgrades at all.

  5. [QUOTE=”Peter_Brosdahl, post: 42400, member: 87″]
    It’s actually the first resolution jump in over a decade that I have no interest in. 1080p-sure, 1440p-I was all about it, 4K-I only just managed to get all my equipment synced up for it and have no desire for anything else right now. In ten years, maybe, but by then I might be retired and no longer care anything about upgrades at all.
    [/QUOTE]
    I think I’m with you here. I liked the jump from 1080 to 4k, although for gaming purposes it was meh. I don’t see any reason to jump to 8k really though.
