Image: Counterplay Games

The minimum and recommended PC specifications that we shared for Godfall yesterday hinted that the fantasy looter-slasher would require some pretty beefy hardware, especially at higher graphics settings.

This has been confirmed in a new video starring Counterplay Games CEO Keith Lee, who made some very interesting comments about the kind of hardware gamers will need to experience Godfall at its maximum fidelity.

Lee revealed that Godfall requires 12 GB of VRAM to run at the 4K resolution setting with Ultra HD textures enabled. That’s bad news for NVIDIA’s GeForce RTX 3080 and GeForce RTX 3070 graphics cards, which only have 10 GB and 8 GB of memory, respectively.

The implication is that NVIDIA fans who wish to run Godfall at its best will require a GeForce RTX 3090, a $1,499 GPU with 24 GB of VRAM. Otherwise, AMD’s new Radeon RX 6000 models – all of which flaunt 16 GB of GDDR6 – should cope nicely.

We’ve copied some of Lee’s comments below, but you can check out the full video at the bottom of this article for a look at what Godfall looks like running in 4K.

At 4K resolution using Ultra HD textures, Godfall requires tremendous memory bandwidth to run smoothly. In this intricately detailed scene, we are using 4K x 4K texture sizes and 12 gigabytes of graphics memory to play at 4K resolution.

The Infinity Cache on AMD’s Radeon RX 6000 Series cards runs Godfall with high frame rates with maximum settings enabled. […]

What you see here is us maxing out the image quality settings to deliver extraordinary visuals while still delivering very high sustained frame rate. We achieve this through Variable Rate Shading, also known as VRS.

We have also overhauled our lighting and shadow systems, leveraging DXR 1.1 ray tracing to realistically model shadows in the scene more closely to what happens in a real world. […]

Of course, ray tracing shadows involves incredibly complex computations, and the Radeon RX 6000 Series GPUs are able to handle it with ease.

Moreover, we’ve enabled the FidelityFX Contrast Adaptive Sharpening, also known as CAS, to sharpen and improve the overall textures and edges of our scenes.


Join the Conversation

25 Comments

  1. Will wait to see actual benchmarks before I join the “sky is falling” crowd.

    I mean, after all, if you had wanted to play with 4K MAX ULTRA SUPER settings, you wouldn’t have skimped with “just” a 3080 and you would have shelled out the cash for the 3090, right?

    Also, I’m sure DLSS will save this.

  2. [QUOTE=”Brian_B, post: 22735, member: 96″]
    Will wait to see actual benchmarks before I join the “sky is falling” crowd.

    I mean, after all, if you had wanted to play with 4K MAX ULTRA SUPER settings, you wouldn’t have skimped with “just” a 3080 and you would have shelled out the cash for the 3090, right?

    Also, I’m sure DLSS will save this.
    [/QUOTE]
    Except NVIDIA has been saying that the 3080 is the 4K card and the 3090 is an 8K card.

  3. I kind of hope it does 100% require 12GB of VRAM at max settings to see how NVIDIA responds to people after they promised that 10GB would be fine.

    That said, I lean more toward the view that it allocates 12 GB of VRAM but actually requires much less.

  4. Whether poorly optimized or truly needing it, I’ve been seeing a few games skim around, or over, the 10 GB mark for years now in 4K. This isn’t that much of a surprise. It is a shame that NV made the 3080 with 10 GB because it will see limits with other games I’m sure. The rumored 20 GB variant is probably going to seem a lot more appealing as other games go over 10 GB as well. Meanwhile, I’d say the current 3080 is probably a fantastic 1440P card and will do fine with less demanding titles in 4K.

  5. I play at 4K with texture-heavy games, so VRAM size is an issue for me. I’ve maxed out the 11 GB on my 1080 Ti, and I’m not going to pay good money to go backwards. As long as benchmarks look decent, I will likely be picking up an AMD card. I think the price of the 3090 is ridiculous.

  6. The question with VRAM-heavy games is whether they need the memory or simply fill it because it’s there. This one might actually need it, but some games don’t really; we’ll see what happens. I might go for an AMD card for the memory if they can be found, though I may need to get a FreeSync monitor too.

    1. If a game is trying to handle more textures than fit into VRAM, the system has to keep swapping them out, taking time to load new textures as needed from main RAM or disk – this can cause slowdowns, stalls, and stuttering.
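A quick back-of-the-envelope sketch of why that swapping hurts frame times. The figures here are assumptions for illustration (roughly 16 GB/s for a PCIe 3.0 x16 link, the 128 MB per-texture estimate from later in this thread), not measurements from Godfall:

```python
# Rough cost of streaming one large texture into VRAM mid-frame.
# All constants are illustrative assumptions, not measured values.

TEXTURE_MB = 128            # one high-res texture set, per the thread's estimate
PCIE_GBPS = 16              # approx. PCIe 3.0 x16 throughput, GB/s
FRAME_BUDGET_MS = 1000 / 60 # frame budget at 60 fps

# Time to move the texture across the bus, in milliseconds
transfer_ms = TEXTURE_MB / (PCIE_GBPS * 1024) * 1000

print(f"Streaming one {TEXTURE_MB} MB texture: ~{transfer_ms:.1f} ms")
print(f"Frame budget at 60 fps: {FRAME_BUDGET_MS:.1f} ms")
```

Under these assumptions a single swap eats roughly half of a 60 fps frame budget, which is why texture thrashing shows up as stutter rather than a smooth frame-rate drop.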

  7. Much like SQL Server, for example, many games will take as much memory as possible; that doesn’t mean they actually need it.

    That said, 10 GB does fall short, even for 4K.

  8. Will be hilarious when this unoptimized clusterfuck of a game takes up all the VRAM regardless of amount.

  9. [QUOTE=”Auer, post: 22808, member: 225″]
    Will be hilarious when this unoptimized clusterfuck of a game takes up all the VRAM regardless of amount.
    [/QUOTE]

    Hardly. Just because most games don’t use that much doesn’t mean that utilizing 12GB of VRAM is a problem. Godfall may very well employ features, textures, or is in some way designed to utilize extra VRAM. Modders have been using extra VRAM since the beginning of texture modding. Every texture I created for ME3 mods was 4x the size of its original. Do that enough, and you can eat up a lot of VRAM. I’m not saying that’s what’s going on here, but it’s ridiculous to assume that [I]”being unoptimized”[/I] is the only reason a game could possibly use 12GB of VRAM at 4K.

  10. [QUOTE=”Dan_D, post: 22828, member: 6″]
    Hardly. Just because most games don’t use that much doesn’t mean that utilizing 12GB of VRAM is a problem. Godfall may very well employ features, textures, or is in some way designed to utilize extra VRAM. Modders have been using extra VRAM since the beginning of texture modding. Every texture I created for ME3 mods was 4x the size of its original. Do that enough, and you can eat up a lot of VRAM. I’m not saying that’s what’s going on here, but it’s ridiculous to assume that [I]”being unoptimized”[/I] is the only reason a game could possibly use 12GB of VRAM at 4K.
    [/QUOTE]
    Well, at this stage assumptions is all I got.

  11. [QUOTE=”Auer, post: 22832, member: 225″]
    Well, at this stage assumptions is all I got.
    [/QUOTE]

    That’s my point. Without having more information, assuming that using 12GB of VRAM at 4K comes down to a lack of optimization is premature. There is simply no reason to think that at this time.

  12. [QUOTE=”Dan_D, post: 22836, member: 6″]
    That’s my point. Without having more information, assuming that using 12GB of VRAM at 4K comes down to a lack of optimization is premature. There is simply no reason to think that at this time.
    [/QUOTE]
    As is assuming it’s because of the 4K textures.

  13. [QUOTE=”Stoly, post: 22840, member: 1474″]
    As is assuming it’s because of the 4K textures.
    [/QUOTE]
    The developer stated that using 4K textures uses 12 GB, so it’s not as much of an assumption as assuming that it’s simply unoptimized.
    A 4K texture takes about 128 MB or more depending on features (RGB alpha, displacement, surface normals, etc.). So, if you have 100 textures, that’s 12+ GB alone – not including the frame buffer, internal render buffers, geometry, or other required data. Which means in reality it’s probably closer to 50 4K textures, which doesn’t seem that far out there for a scene. Maybe they won’t all be on screen simultaneously and there will only be small, infrequent dips in performance. Who knows until we actually have testing done? Either way, if it’s texture size or poor optimization, the only thing that matters to a gamer is how it runs on their hardware.
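The arithmetic in that comment can be sketched directly. The per-texel size and map count below are illustrative assumptions (uncompressed RGBA8, two maps per material, mipmaps and block compression ignored), not data from the game:

```python
# Back-of-the-envelope VRAM math following the comment's own numbers.
# Constants are illustrative assumptions, not measured values from Godfall.

BYTES_PER_TEXEL = 4                       # e.g. uncompressed RGBA8

def texture_mb(width, height, maps=2):
    """MB used by one material built from `maps` textures at the given
    resolution (e.g. albedo + normal), uncompressed, no mip chain."""
    return width * height * BYTES_PER_TEXEL * maps / (1024 ** 2)

per_material = texture_mb(4096, 4096)     # two 4K maps -> 128 MB
print(f"One 4K material: {per_material:.0f} MB")
print(f"100 materials:  {per_material * 100 / 1024:.1f} GB")
```

With these assumptions one material lands on the commenter’s ~128 MB figure, and 100 of them come to about 12.5 GB before the frame buffer, render targets, or geometry are counted. Real engines use block compression (BC1/BC7), which cuts this by 4–8x, which is one reason raw texel math tends to overestimate.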

  14. [QUOTE=”Ready4Droid, post: 22841, member: 245″]Either way, if it’s texture size or poor optimization, the only thing that matters to a gamer is how it runs on their hardware.
    [/QUOTE]

    And that’s the main issue here. The game runs poorly even without RTX/ultra settings. We’ll have to see how it performs on the 6000 series to find out whether it’s really bad programming or the RTX series just can’t handle it.

  15. [QUOTE=”Stoly, post: 22840, member: 1474″]
    As is assuming it’s because of the 4K textures.
    [/QUOTE]

    I never said or assumed it was due to 4K textures. I simply used modding games as an example of how higher resolution textures consume more VRAM. I simply pointed out that there are other causes besides: [I]”the game isn’t optimized”.[/I]

  16. [QUOTE=”Dan_D, post: 22849, member: 6″]
    I never said or assumed it was due to 4K textures. I simply used modding games as an example of how higher resolution textures consume more VRAM. I simply pointed out that there are other causes besides: [I]”the game isn’t optimized”.[/I]
    [/QUOTE]
    I get it; it’s just that there are so many examples of bad optimization, especially in PC ports, that I just had to go with the odds… 😀 😀 :rolleyes::rolleyes:

  17. [QUOTE=”Stoly, post: 22865, member: 1474″]
    I get it; it’s just that there are so many examples of bad optimization, especially in PC ports, that I just had to go with the odds… 😀 😀 :rolleyes::rolleyes:
    [/QUOTE]

    Well, it’s a definite possibility. But I am not going there without having played the game or seen it run on modern hardware in the hands of the public.

  18. [QUOTE=”Brian_B, post: 22868, member: 96″]
    What happened to the “I’m glad games are pushing hardware envelope” people?
    [/QUOTE]

    I’m right here. Pretty much alone.

  19. [QUOTE=”Brian_B, post: 22868, member: 96″]
    What happened to the “I’m glad games are pushing hardware envelope” people?
    [/QUOTE]
    All for it of course, as long as it’s not because of incompetence or lazy dev shit.

    RT fits that category too, and I’m a fan.

  20. [QUOTE=”Dan_D, post: 22869, member: 6″]
    I’m right here. Pretty much alone.
    [/QUOTE]
    I’m with you. I’m also of the opinion that the word “optimize” gets thrown around too much by people who have no idea what the concept means.

  21. Also, about pushing hardware…

    It probably is not in the best financial interest of a publisher to have a game with very high requirements. From a sales point of view.

    That should mean that it would be smart to have well optimized games. And yet…

  22. [QUOTE=”Auer, post: 22898, member: 225″]
    Also, about pushing hardware…

    It probably is not in the best financial interest of a publisher to have a game with very high requirements. From a sales point of view.

    That should mean that it would be smart to have well optimized games. And yet…
    [/QUOTE]

    Ideally, games would push the boundaries of what modern hardware can do, yet scale well on older hardware and still look appealing. On the subject of optimization, it’s not necessarily a matter of that: sometimes design choices cost performance but make the game look better. Crysis famously did this, and it’s probably the sole reason the game has aged as well as it has.

    That said, games that aren’t optimized worth a crap are often the result of developers doing the bare minimum to get things running on the PC platform – often because publishers, who shoulder much of the development costs, push companies into releasing product quickly.
