For 1440p, 8GB right now is OK. Doom Eternal will use over 7GB on Nightmare, but at 4K it will overflow the memory buffer. I don't judge cards on what they can do; it matters more to me what they can't. For example, if a card doesn't have enough VRAM for even one game at 4K, then it isn't a 4K card. Why do I look at it like this? Basically because of the price of these cards compared to other gaming equipment.
I think that standard is calibrated a bit too sensitively. Some games are arbitrarily demanding because they haven't been optimized, or because they have "future proofed" settings, which you see with games like the old Crysis and now of course Crysis Remastered, and with games like FS2020; these can't run at 4K ultra even on a 3090, and calling that not a 4K card would be pretty controversial, I feel. It's the same with what I consider generally badly optimized games; Ark: Survival Evolved comes to mind, which maxed out barely maintains 60fps at 1440p on a 3080. That standard is far too sensitive to outliers, especially when the outliers might not be games you even play.
Doom Eternal was used because it's a game that likes more VRAM than the 2080 they were comparing it to has.
Benchmarks that a cursory search turns up conflict with that:
here https://www.techspot.com/article/1999-doom-eternal-benchmarks/
here https://www.techpowerup.com/review/doom-eternal-benchmark-test-performance-analysis/4.html
here https://www.kitguru.net/components/graphic-cards/dominic-moass/doom-eternal-pc-performance-analysis/
here https://www.eurogamer.net/articles/digitalfoundry-2020-nvidia-geforce-rtx-3080-review?page=2
All show the 4K Ultra Nightmare preset with 8GB cards like the 2080 beating 11GB cards like the 1080 Ti.
So, it should not simply be "Is 8GB on the 3070 enough?", it should be "Which resolutions is it enough for?". Hint: not 4K.
Right, but by your own standards no card is a 4K card. You're taking a single dubious example, one with a lot of evidence against it, and using it as a singular exception to say a card isn't 4K; by that set of standards, no card is 4K.
And I'm going to get all autistic about this when it comes to Doom, but it uses the id Tech 7 engine, an evolution of id Tech 6, and I investigated claims of 10GB not being enough in the 3080 thread using another game on that engine, Wolfenstein II. The claims people had made were that if you ran the game at what they were calling 8K or 16K textures, you could reach something like 18-20GB of VRAM usage. After spending some time investigating those claims, it turned out the config value they were editing was NOT texture resolution (as the terms "8K textures" and "16K textures" would suggest) but rather the "is_poolSize" value, which is the memory pool in VRAM, measured in MB, that the engine allocates for texture streaming. So what they were doing was setting this to 16,000, which they thought meant 16K textures, and actually setting the reserved VRAM pool to 16GB, which accounted for the jump to 18+GB of VRAM usage. Note that the game pre-allocated all this VRAM before a level was even loaded, so that VRAM usage was not a measure of what VRAM was being put to good use.
From my own testing, setting the texture options in the menu for Wolfenstein II was altering this same value in the config file, so the presets were setting something like 1GB, 2GB, 4GB as the reserved pool. It didn't have a direct impact on texture quality; it simply gave the engine a bigger memory pool for texture swapping, which in some areas, in some circumstances, stopped texture pop-in. And all the people reviewing the so-called "16K textures" were doing side-by-side comparisons on YouTube and seeing that, no, there's no visible difference in game.
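To make that allocated-versus-used distinction concrete, here's a toy Python sketch of how I understand it works (my assumption of the general scheme, not actual engine code; the preset names and numbers are made up): the pool is reserved up front, so a VRAM counter reports the whole reservation even before any texture data has been streamed in.

```python
# Toy model (assumed, not engine code) of a pre-reserved texture streaming pool.
# An "allocated VRAM" counter sees the whole reservation; how much of it holds
# useful texture data is a separate number.

PRESET_POOL_MB = {"low": 1024, "medium": 2048, "high": 4096, "edited_16k": 16000}

class StreamingPool:
    def __init__(self, pool_mb):
        self.capacity_mb = pool_mb   # reserved in VRAM at startup
        self.resident_mb = 0         # texture data actually streamed in so far

    def allocated_mb(self):
        # What an allocation counter reports: the full reservation.
        return self.capacity_mb

    def in_use_mb(self):
        # What a per-resource measurement would ideally report.
        return self.resident_mb

pool = StreamingPool(PRESET_POOL_MB["edited_16k"])
print(pool.allocated_mb())  # 16000 MB "used" before a single level has loaded
print(pool.in_use_mb())     # 0 MB of texture data actually resident
```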
I feel this is relevant to Doom because all these texture presets are doing the same thing as in Wolfenstein: they pre-set a pool of reserved memory in VRAM, which is then used to swap textures as you move through the level, first painting low-quality textures onto all the surfaces that are far away or out of sight, then swapping them to full resolution as you get near them, and flushing old unused textures out of memory when you're far away from the surfaces that use them.
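As a rough illustration of that streaming behaviour (again my assumption of the general idea, heavily simplified, with made-up names and numbers): everything gets a cheap low mip, nearby surfaces are promoted to full resolution while the pool has room, and a pool that's too small shows up as pop-in or blurry textures rather than as extra detail from a bigger pool.

```python
# Toy sketch (assumed, simplified) of distance-based texture streaming
# inside a fixed VRAM pool.
from dataclasses import dataclass

@dataclass
class Surface:
    name: str
    distance: float   # metres from the camera
    low_mip_mb: int   # cheap version, always resident
    full_mb: int      # full-resolution texture

def stream(surfaces, pool_capacity_mb, near=50.0):
    # Pass 1: every surface gets its low mip (what you see far away).
    resident = {s.name: s.low_mip_mb for s in surfaces}
    used = sum(resident.values())
    # Pass 2: promote nearby surfaces to full resolution while the pool has
    # room, nearest first. Whatever doesn't fit stays on the low mip, which
    # is the pop-in / blurry-texture symptom of a too-small pool.
    for s in sorted(surfaces, key=lambda s: s.distance):
        extra = s.full_mb - s.low_mip_mb
        if s.distance <= near and used + extra <= pool_capacity_mb:
            resident[s.name] = s.full_mb
            used += extra
    return resident, used

scene = [Surface("wall", 5, 8, 256), Surface("floor", 20, 8, 256),
         Surface("tower", 400, 8, 256)]
print(stream(scene, pool_capacity_mb=300))   # wall full-res, floor stuck on its low mip
print(stream(scene, pool_capacity_mb=2000))  # bigger pool: both near surfaces full-res
```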
So one question would be: does this even change visual quality in game? Has anyone even checked? Has anyone checked memory usage with the new MSI Afterburner, which measures per-process usage and not just memory allocated? I don't own the game, but I might pick it up just to double-check some of these claims, because after a lot of research into similar claims with Wolfenstein II I'm now familiar with how this all works and I'm highly skeptical, especially considering the inconsistent benchmark reviews for this game showing 8GB cards on Nightmare beating 11GB cards just fine.
The problem as I see it? The XBSX is £450 and runs 4K. The cheap Xbox model runs 1440p and is considerably cheaper than a 3070 on its own, never mind the extra £500 in parts to make it a complete PC, and of course then a monitor, keyboard, mouse and desk.
Euugh, no, not really. I mean, it's marketed as 4K, but let's face it, the games are dumbed down with their visual settings until 4K is playable on them; the APUs in them are basically middle of the road compared to a PC, which is why they're so cheap. To jackknife so seamlessly from expecting ultra super dooper pooper mode in Doom on the PC to basically whatever middle-of-the-road (equivalent) settings the consoles will use is really... weird.