Is 8GB of VRAM enough for the 3070?

Soldato
Joined
26 Aug 2004
Posts
5,032
Location
South Wales
1440p is the new 1080p; consoles are making 4K the norm. 1440p will look pretty disgusting in 2 years, let alone 4.
You keep using this argument, yet you know full well the consoles can't handle 4K60 in demanding titles. The new Dirt game cuts back graphics and drops resolution between 1440p and 1080p just to maintain around 90fps in its high-performance mode, as shown by DF.

Complete rubbish as usual; pixel density is also a thing, depending on screen size.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Too many people are spreading false news without hard facts. Even when it's been debunked, they will not stop believing... Some people just don't like to be wrong and will fight till the end... Sad really.

Yet we seem to keep repeating ourselves...

Getting pretty tiresome. :o

I'm personally willing to listen, because maybe there are games which legitimately use a high amount of vRAM while looking good and running at a high frame rate. I'd be interested in testing that, if nothing else, so I learn something from it. I've had to learn a crapload of stuff to get up to speed with this topic, which I've really enjoyed.

But everything suggested so far across both threads has been debunked with evidence, and a lot of those claims I've tested myself. It only takes five minutes to download the MSI Afterburner beta, configure the tool, and then just load up a game and see what you get.
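If anyone wants to script that check rather than read it off an overlay, here's a minimal sketch using NVIDIA's NVML Python bindings (assuming `nvidia-ml-py` is installed; the per-process figure is the closest analogue to Afterburner's dedicated-usage counter, and can come back empty on some Windows driver setups):

```python
# Minimal sketch: query VRAM actually in use via NVML (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Device-wide view: total vs currently used VRAM
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Total VRAM: {mem.total / 2**30:.1f} GiB")
print(f"In use    : {mem.used / 2**30:.1f} GiB")

# Per-process view: the game's own footprint rather than the whole card
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used_mb = (proc.usedGpuMemory or 0) / 2**20  # may be None on WDDM
    print(f"PID {proc.pid}: {used_mb:.0f} MiB")

pynvml.nvmlShutdown()
```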
 
Soldato
Joined
18 Feb 2015
Posts
6,484
You keep using this argument, yet you know full well the consoles can't handle 4K60 in demanding titles. The new Dirt game cuts back graphics and drops resolution between 1440p and 1080p just to maintain around 90fps in its high-performance mode, as shown by DF.

Complete rubbish as usual; pixel density is also a thing, depending on screen size.

And you can look for confirmation of this in the footnotes for every game, where they are all using dynamic res plus some form of temporal accumulation (e.g. Spider-Man doing temporal injection on PS5). In reality these are still 1440p consoles, as that makes the most sense, and going forward developers will certainly not decrease graphical load to allow for 4K; quite the opposite.

I mean, if you look at the peak of console power as anticipated by devs, it's the UE5 demo, which runs at 1440p 30fps. There's some room for tweaking to maybe reach 60fps, but 4K too? Unlikely.
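For anyone unsure what dynamic res actually does, here's a hedged illustration (my own sketch, not any engine's real code): each frame the renderer nudges its internal resolution toward a frame-time target, and the temporal upscale to the output resolution masks the change.

```python
# Sketch of a dynamic-resolution controller: chase a frame-time budget by
# scaling the render resolution, then upscale the result to the output res.
TARGET_MS = 16.7                 # 60 fps frame budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # clamp the per-axis resolution scale

def next_scale(prev_scale: float, last_frame_ms: float) -> float:
    # GPU cost is roughly proportional to pixel count (scale squared),
    # so adjust the per-axis scale by the square root of the time ratio.
    ratio = (TARGET_MS / last_frame_ms) ** 0.5
    return max(MIN_SCALE, min(MAX_SCALE, prev_scale * ratio))

# A 20 ms frame at native 2160p drops the next frame to roughly 1974p
print(round(next_scale(1.0, 20.0) * 2160))  # ~1974
```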
 
Soldato
Joined
18 May 2010
Posts
22,376
Location
London
The only fly in the ointment for this vram_allocated vs vram_requested debate is when a game imposes a hard limit on the settings you can select based on vram_requested, even when vram_allocated would end up being less.
 
Associate
Joined
2 Feb 2018
Posts
237
Location
Exeter
I'm personally willing to listen, because maybe there are games which legitimately use a high amount of vRAM while looking good and running at a high frame rate. I'd be interested in testing that, if nothing else, so I learn something from it. I've had to learn a crapload of stuff to get up to speed with this topic, which I've really enjoyed.

But everything suggested so far across both threads has been debunked with evidence, and a lot of those claims I've tested myself. It only takes five minutes to download the MSI Afterburner beta, configure the tool, and then just load up a game and see what you get.

Trust me, I know. I've been gaming on a 4K monitor since 2013, and every time a new GPU comes out it's the same old story: people moaning about not enough VRAM, without facts.

I'm currently gaming on a 5600 XT at 4K to tide me over, as I sold my 2080 Ti and am waiting to see what to buy next. Even with its limited 6GB of VRAM and 192-bit bus, I've not hit VRAM limits in SOTR, RDR2, COD War one, Hitman 2, BFV and so on, with these games maxed out with AA.

So the people who say 8GB is not enough for 4K gaming are wrong. Turn AA off and you will save a ton of VRAM; it really isn't needed at 4K, and you'll get better clarity too.
Also, some games have settings that just break the game engine with no real image-quality benefit, just bragging rights...

So is 8GB of VRAM enough until the next-gen GPUs come out? I say yes.
But if you don't plan to upgrade every generation, then no: 8GB is not enough for future-proofing if you plan to skip a generation or two, not without dropping some settings.

Time will tell.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
@Perfect_Chaos @Poneros I agree, and I've seen that DF video on Dirt 5; it's very thorough. It shows the consoles have enough horsepower to push the limits in one area, but only by sacrificing the others. So it's either high frame rate, high resolution or high visual settings: picking one dials the others back. 4K will be great for more casual games that don't go too heavy on effects, but if we're talking about AAA titles, which most of us are, then they're not really 4K machines. The APUs have about as much GPU power as a mid-range PC video card, which isn't 4K-ready for AAA games.

PrincessFrosty, brilliant post. WAY more effort than I'd put into such a post, and more effort than people here deserve, that's for sure. But regardless, *thumbs up*

Thank you. I'm glad some people get something out of it :cool: The best way forward is to simply look at some of the claims being made and then just test them.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
The only fly in the ointment for this vram_allocated vs vram_requested debate is when a game imposes a hard limit on the settings you can select based on vram_requested, even when vram_allocated would end up being less.

It can do, yes. People reported that Doom Eternal was preventing them from using Ultra Nightmare settings when they had less vRAM. And someone reported Red Dead Redemption 2 doing a similar thing, although I've not confirmed that myself, and it seemed to go against the experience of everyone who replied.

Specifically with Doom Eternal, vRAM usage increases with settings; however, the "Texture Pool Size" setting is especially brutal on vRAM, because it quite literally just alters an engine parameter for how much vRAM is reserved for texture streaming. That value is "is_poolsize". At Ultra Nightmare it reserves a whopping 4.5GB of vRAM for that setting alone, which pushes the game up to a "real" usage of about 7GB. But it's actually not necessary: the game is perfectly happy with the "High" setting, where the pool uses about 1.5GB and the game uses about 4-5GB in total, with zero visual difference, because it never needs to stream 4.5GB of textures in and out. The dumb thing is you can just go into the console and type is_poolsize "4608" and you'll get the "Ultra Nightmare" pool size, or, if that slightly exceeds your vRAM budget, you can just drop the value. That's literally all "Texture Pool Size" in the video settings does: tweak that value.
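To make that budgeting concrete, here's a rough sketch of the arithmetic (the pool sizes are the ones quoted above; the "everything else" figure is an assumption purely for illustration):

```python
# Rough VRAM budget sketch for the Texture Pool Size example above.
POOL_SIZES_MB = {
    "high": 1536,             # ~1.5GB pool
    "ultra_nightmare": 4608,  # matches is_poolsize "4608" in the console
}
OTHER_USAGE_MB = 2560  # assumed: framebuffers, geometry, shaders, etc.

def total_usage_gb(pool_setting: str) -> float:
    return (POOL_SIZES_MB[pool_setting] + OTHER_USAGE_MB) / 1024

print(total_usage_gb("ultra_nightmare"))  # 7.0 -> tight on an 8GB card
print(total_usage_gb("high"))             # 4.0 -> comfortable headroom
```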
 
Associate
Joined
17 Sep 2020
Posts
624
Fundamentally, people just don't seem to understand how vRAM works on the Ampere cards with GDDR6X memory. They compare it to prior-gen tech and then spend far too much time and energy proving they're right, when they've already shown they don't understand how data is stored, accessed and allocated in GDDR6X memory on Ampere... It's really tiresome, but you guys keep chasing your tails...
 
Associate
Joined
15 Jun 2009
Posts
2,189
Location
South London
8GB GPU for £600 or 16GB GPU for £600... it's a no-brainer.

We would all take more for the same money. End of argument, end of story. It is a waste of time to "prove" what is needed or not.

I understand certain people are determined to be "right", but you can only guarantee one thing: the need for more VRAM will always rise over time. Just accept that and move on.
 
Soldato
Joined
6 Feb 2019
Posts
17,595
The "8gb is enough for 4k" crowd will now be working overtime to convince people after AMD's entire lineup ships with 16gb
 
Associate
Joined
25 Sep 2020
Posts
128
It would be a great card for that! The funny thing is, though, the 6800 XT has a bigger performance advantage over the 3080 at 1440p than at 4K. Ampere is better suited to higher resolutions thanks to its design, while RDNA 2 is better for high-fps gaming. It's just the way they're designed.

Remember also: all DLSS gives you is an intermediate step between render resolution and output resolution, trading visuals against performance.

So what I mean is, if you're playing at 4K, think of it as giving you 1800p image quality for 1440p performance (in Quality mode). That's it, and it also requires per-game implementation. Once you understand that, you won't be so blown away by the DLSS "magic". It's decent, don't get me wrong, but it's blown waaaay out of proportion by the marketing material that's burrowed into people's minds. It's also much better at higher resolutions than lower ones, so it won't look as good at 1440p as it does at 4K, and it works best when you're heavily GPU-bound rather than a mix, so it's much less effective for high-fps gaming.
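To put numbers on that, here's a quick sketch using the commonly cited per-axis scale factors for DLSS 2.x (the Balanced figure in particular should be treated as approximate):

```python
# Sketch: internal render resolution for a given DLSS mode and output res.
SCALE = {  # per-axis scale factors, as commonly cited for DLSS 2.x
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def render_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output in Quality mode renders internally at 1440p-class resolution
print(render_res(3840, 2160, "quality"))      # (2560, 1440)
print(render_res(3840, 2160, "performance"))  # (1920, 1080)
```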



https://www.nvidia.com/en-us/geforce/forums/3d-vision/41/294436/nvidia-dlss-your-questions-answered/
Thanks for the information, I see what DLSS is now. I still think it's a good technology; I mean, it will probably help my card last longer, and there's no guarantee of DLSS coming to RDNA2... But even if an open-source DLSS-like technology is released in the future, Nvidia will probably support that tech...
I haven't made my decision yet. Maybe that Super Resolution thing they showed could be a DLSS replacement; I'm gonna wait until early 2021 before making a purchase decision, as I wanna see what Nvidia does...
A lot of the games I'm interested in playing are going to feature DLSS and ray tracing... Also, I've heard they might introduce a DLSS 3.0 relatively soon, which is supposed to be a lot easier to implement than DLSS 2/2.1. Let's see what comes of that...

I mean, after seeing yesterday's slides I really wanna go AMD, but as long as they don't have DLSS I don't really feel like it.

I think DLSS could give my card some extra longevity :p as it also reduces vRAM usage a bit...
 
Associate
Joined
25 Sep 2020
Posts
128
AMD's offerings seem to be better at 1440p, so as you plan to keep the card for a long time, AMD are offering you 6GB of extra VRAM and better 1440p performance in all games, whereas Nvidia will likely only have DLSS in a small portion of the games released every year. In your position I would be going AMD. That said, let's wait and see proper reviews before we jump to conclusions.
Okay, thanks for the response, I'll be waiting for reviews...

My concern is that RDNA2 might not support a DLSS-like technology; who knows, maybe they will only support it from RDNA3 onwards lol. That is one reason I want to go Nvidia, as DLSS is kind of the future, maybe eventually in an open version like DirectML. It's going to help a lot: with Moore's law slowly dying, we might not keep getting the generational improvements we have been getting, so this upscaling tech is gonna come in handy...
 
Associate
Joined
25 Sep 2020
Posts
128
1440p is the new 1080p; consoles are making 4K the norm. 1440p will look pretty disgusting in 2 years, let alone 4.
What do you mean when you say disgusting? I currently have 1080p, so it should be an upgrade. 4K is still too expensive if I want 100+ fps; I'd have to buy a new card every 2 years...
 
Associate
Joined
25 Sep 2020
Posts
128
I can counter your wall of text with a few facts:

1. Try playing Doom Eternal at 4K max on an 8GB card. You lose loads of performance relative to cards with more memory, due to 8GB not being sufficient.
2. Consider that the new consoles (releasing soon!) are getting double the total memory of the previous console generation: 8GB total last gen, 16GB total this gen. We can roughly say that console games will be able to double the VRAM they use (whatever percentage of total memory gets dedicated to VRAM; consoles are ultra efficient/optimised, so it's hard to compare to PC).
3. Next-gen AAA games (many are console 'ports') will very likely use anywhere up to double the VRAM they use now. Game developers are under pressure to leverage every MB of storage and every MHz of processing power in the new consoles; you can bet texture sizes etc. are about to explode.
4. Some of the games you mentioned (Age of Empires 3......) are based on very old games. These will run on an overclocked potato and don't require a 3090. It's kind of pointless mentioning the amount of memory an old game uses.....
5. Note to the few who are about to quote me and say that the consoles don't have 16GB of VRAM: I'm aware of this, and don't claim they do. They have 16GB of total memory, double the 8GB of the previous generation. Double VRAM requirements incoming, yo!
Um, I don't think PC requirements will double for at least two years; that's how long it takes games to drop support for last-gen consoles, so the true next-gen games will probably arrive by 2022/2023...

I suppose 10GB for 4K might take a blow by then, but for 1080p and 1440p it should probably be enough... (see the rough arithmetic sketched below)
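To put rough numbers on point 2 of the quote above (the Series X memory split and OS reserve are published figures; the last-gen OS reserve is approximate):

```python
# Back-of-envelope for the "double the memory" argument.
LAST_GEN_TOTAL_GB = 8.0   # PS4 / Xbox One unified memory
LAST_GEN_OS_GB = 3.0      # approximate last-gen OS reservation
THIS_GEN_TOTAL_GB = 16.0  # PS5 / Series X unified memory
THIS_GEN_OS_GB = 2.5      # Series X OS reservation (13.5GB left for games)
GPU_OPTIMAL_GB = 10.0     # Series X fast (560 GB/s) partition

last_gen_games = LAST_GEN_TOTAL_GB - LAST_GEN_OS_GB  # ~5.0 GB
this_gen_games = THIS_GEN_TOTAL_GB - THIS_GEN_OS_GB  # 13.5 GB
print(f"Game budget: {last_gen_games:.1f}GB -> {this_gen_games:.1f}GB "
      f"({this_gen_games / last_gen_games:.1f}x)")   # ~2.7x
print(f"Of which GPU-optimal fast memory: {GPU_OPTIMAL_GB:.0f}GB")
```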
 
Soldato
Joined
2 Feb 2010
Posts
10,769
Location
East Midlands
8GB is enough for now, but not if you plan on keeping your card for any length of time.

I'm on the fence about whether to buy one of these for this exact reason.

I like the look of the AMD cards, but they're too rich for my blood. I just don't do enough gaming to be spending any more than £500; I just can't justify it.
 
Soldato
Joined
16 Jan 2006
Posts
3,020
Trust me, I know. I've been gaming on a 4K monitor since 2013, and every time a new GPU comes out it's the same old story: people moaning about not enough VRAM, without facts.

I'm currently gaming on a 5600 XT at 4K to tide me over, as I sold my 2080 Ti and am waiting to see what to buy next. Even with its limited 6GB of VRAM and 192-bit bus, I've not hit VRAM limits in SOTR, RDR2, COD War one, Hitman 2, BFV and so on, with these games maxed out with AA.

So the people who say 8GB is not enough for 4K gaming are wrong. Turn AA off and you will save a ton of VRAM; it really isn't needed at 4K, and you'll get better clarity too.
Also, some games have settings that just break the game engine with no real image-quality benefit, just bragging rights...

So is 8GB of VRAM enough until the next-gen GPUs come out? I say yes.
But if you don't plan to upgrade every generation, then no: 8GB is not enough for future-proofing if you plan to skip a generation or two, not without dropping some settings.

Time will tell.

Maybe 4K with no AA is fine at whatever size your monitor is, but I still need it.
 