Is 8GB of VRAM enough for the 3070?

Yeah, so, I got a VR headset to play games like Alyx. All in all, I can safely say I will always be looking at Nvidia products with a grain of salt from now on. 8GB is nowhere near enough, at all. If you want to go VR, you probably need something like 16GB at this point to have enough headroom for the insane render resolutions that VR needs.

I mean, the problem right now is that Nvidia still performs better in VR, but even so, it still has driver issues. AMD seems to perform worse in VR in general, also because of driver issues. I genuinely think this will change, because AMD has the 16GB cards and Nvidia has the 8-10GB ones. At this point I am regretting buying my 3070. It's just nonsensical to pay such amounts for a GPU that doesn't even have enough memory to begin with.
 


This is what I expect to see more of: games using a flexible streaming model that adjusts for VRAM (more VRAM = better graphics and less pop-in; less or not enough VRAM = lower image quality and more pop-in).

Not every game uses this yet. In Watch Dogs: Legion the graphics don't change - the framerate just tanks into single digits, but it would be better to maintain 60fps by lowering image quality and accepting more pop-in.

This is something reviewers will have to address sooner rather than later, because you can set the game to the same preset on two different cards, but the card with less VRAM can lower its image quality and artificially improve its performance in reviews. People have been saying Nvidia lowers image quality in its driver to improve performance; well, now games are doing it. Game performance testing just became a whole lot more complicated.
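For anyone curious what that "flexible streaming model" looks like in practice, here's a minimal sketch of the idea. None of it is taken from a real engine; the function name, thresholds and numbers are all made up purely to illustrate how a game can quietly trade texture quality for VRAM headroom:

```python
# Hypothetical sketch (not from any real engine) of a VRAM-aware streaming budget:
# the game drops texture mip levels instead of stuttering when VRAM runs short.

def pick_mip_bias(vram_total_mb: int, vram_in_use_mb: int, pool_needed_mb: int) -> int:
    """Return how many mip levels to drop: 0 = full quality, higher = blurrier."""
    headroom_mb = vram_total_mb - vram_in_use_mb
    if headroom_mb >= pool_needed_mb:
        return 0   # plenty of VRAM: full-resolution textures, minimal pop-in
    elif headroom_mb >= pool_needed_mb // 2:
        return 1   # getting tight: drop one mip level on streamed textures
    else:
        return 2   # not enough VRAM: lower image quality and more pop-in, but no stutter

# Same preset, different cards: the 8GB card quietly lowers quality,
# the 16GB card does not, yet both report identical settings in a review.
print(pick_mip_bias(8192, 6000, 4000))    # 8GB card  -> 1
print(pick_mip_bias(16384, 6000, 4000))   # 16GB card -> 0
```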
 

But But...... you don't need more than 8GB!!!!!

Reminds me of "But But.... you don't need more than 4 cores"



The 3060 12GB is extremely weird.

With a 192-bit bus your options are 6GB or 12GB, and from the 6GB 2060 Nvidia realised people don't want £300 6GB cards. It's a shame Nvidia's fan base don't feel the same way about 8GB £500 cards.
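For anyone wondering why the options are exactly 6GB or 12GB, it falls out of the bus width: each GDDR6 chip has a 32-bit interface and comes in 1GB or 2GB densities. A quick back-of-the-envelope check (standard GDDR6 figures, nothing Nvidia-specific):

```python
# Why a 192-bit bus means 6GB or 12GB: each GDDR6 chip uses a 32-bit interface
# and (on these cards) comes in 1GB or 2GB densities.
bus_width_bits = 192
chips = bus_width_bits // 32                     # 6 memory chips

print(chips * 1, "GB with 1GB chips")            # 6 GB  (the 2060 configuration)
print(chips * 2, "GB with 2GB chips")            # 12 GB (the 3060 configuration)

# Same logic on a 256-bit bus (3060 Ti / 3070): 8 chips -> 8GB or 16GB.
print((256 // 32) * 1, "GB or", (256 // 32) * 2, "GB on a 256-bit bus")
```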
 
Yup, the game was notorious for texture pop-in/rendering issues, even on the PS4 Pro where I first played it. The PC patch notes look like they had to address multiple issues surrounding it:

https://horizon.fandom.com/wiki/Horizon_Zero_Dawn_updates

Here's hoping Days Gone and the next lot of PS games coming to PC are better done on the first go!
So it can be fixed, but once 16GB VRAM cards become mainstream, that may turn developers careless about it in the future...

similar to how they don't care for the "GTX 1060 gang" that is always on top of the Steam hardware survey charts, yet that GPU runs Cyberpunk at 1080p at 35-40 fps with low/medium mixed settings.
 
I think the AMD cards were also having some issues with textures, though, as there were a few videos where it looked like AMD had pop-in whereas Nvidia didn't. Basically, it was the game at fault, not the hardware.
 
I tried replicating the issue on my 3070 but couldn't manage to do so.

2160p ultra



Horizon Zero Dawn won't crash; it's a different game. You guys should go watch Hardware Unboxed's explanation of this: Horizon is one of the games that prevents itself from running out of VRAM by dumping data from VRAM and lowering some settings.
 
With a 192-bit bus your options are 6GB or 12GB, and from the 6GB 2060 Nvidia realised people don't want £300 6GB cards. It's a shame Nvidia's fan base don't feel the same way about 8GB £500 cards.

And that's assuming a sunny day; I have seen plenty of inflated, gouged and scalped sales adding big percentages on top of those already generous prices! :)
 
Anyone else set up their system to enable the Resizable BAR feature? I just tested Cyberpunk 2077, and the issue I described before (going over 8GB of VRAM causes a lag spike) seems to be less severe.

Cyberpunk is one of the games that supports Resizable BAR, and it seems like, while the FPS does tank a little when I see the game go over my 3070's VRAM limit, it no longer freezes up for a second like it used to.

If I understand correctly, this is directly due to the BAR feature? Because the memory is no longer limited to small chunks, it can be cycled faster...? I am no expert in this, but this seems reasonable. Perhaps this was the reason why Nvidia was so ballsy and greedy regarding the VRAM of these cards...? Since they knew the BAR thing would just eliminate the "negative" effects of not having enough VRAM?

Well, that is just my theory anyway. It's definitely a great feature to have and should be a default option for us RTX 3000 users in all games from now on, IMO.
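If anyone wants to check whether Resizable BAR is actually in effect (and not just switched on in the BIOS), one rough way on Nvidia cards is to compare the BAR1 aperture against total VRAM: with the feature off, BAR1 is typically only 256MB; with it on, it should cover the whole card. This is just a sketch using the pynvml bindings (pip install pynvml), and the exact attribute names can vary between versions:

```python
# Rough check for Resizable BAR on an Nvidia card using the NVML bindings.
# With Resizable BAR off, the BAR1 aperture is usually only 256MB;
# with it on, BAR1 should span the card's whole VRAM.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

vram = pynvml.nvmlDeviceGetMemoryInfo(handle)
bar1 = pynvml.nvmlDeviceGetBAR1MemoryInfo(handle)

print(f"VRAM total: {vram.total / 1024**3:.1f} GB, used: {vram.used / 1024**3:.1f} GB")
print(f"BAR1 size:  {bar1.bar1Total / 1024**3:.1f} GB")

if bar1.bar1Total >= vram.total:
    print("BAR1 covers all of VRAM - Resizable BAR looks active")
else:
    print("BAR1 is much smaller than VRAM - Resizable BAR likely off")

pynvml.nvmlShutdown()
```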
 
Perhaps this was the reason why Nvidia was so ballsy and greedy regarding the VRAM of these cards...? Since they knew the BAR thing would just eliminate the "negative" effects of not having enough VRAM?

Well, I think Nvidia were so ballsy and greedy regarding the VRAM because they know 100% that people will still choose them over AMD anyway. Why add more VRAM when you'll destroy the competition in sales anyway? The Nvidia brand has so much mindshare that even a 6GB 3070 would still have outsold AMD.
 
If you do not need RT, then the AMD 6700 XT with its 12GB makes the Nvidia 3060 Ti a poor choice if you're not paying FE prices. The same could be said for the 3070 FE, but only just, as the 256-bit bus really helps over the 192-bit bus on the 6700 XT.
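On the 256-bit vs 192-bit point, the raw bandwidth difference is easy to work out from the commonly quoted memory speeds (14Gbps GDDR6 on the 3060 Ti/3070, 16Gbps on the 6700 XT), bearing in mind this simple sum ignores the 6700 XT's Infinity Cache, which closes a lot of the practical gap:

```python
# Back-of-the-envelope memory bandwidth from bus width and memory speed.
# Uses the commonly quoted reference specs; the 6700 XT's Infinity Cache
# isn't captured by this simple calculation.
def bandwidth_gb_s(bus_width_bits: int, gddr_speed_gbps: float) -> float:
    return bus_width_bits * gddr_speed_gbps / 8   # Gbps per pin -> GB/s total

print(f"3060 Ti / 3070 (256-bit, 14 Gbps): {bandwidth_gb_s(256, 14):.0f} GB/s")   # 448 GB/s
print(f"6700 XT        (192-bit, 16 Gbps): {bandwidth_gb_s(192, 16):.0f} GB/s")   # 384 GB/s
```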
 
So far most games have been keeping memory usage below 8GB to suit Nvidia cards, because most people buy them. Now that AMD is becoming more popular, that might change.
 
I've been playing Dirt 5 (yes, I know, it's an arcade-style driving game, but I quite like it, and it's been a bit buggy). Anyway, the game is AMD-favoured, so my 6900 XT utilised up to 10GB while playing it.
I suppose the game will adjust to whatever hardware it finds it's working with?
 
So far most games have been keeping memory usage below 8GB to suit Nvidia cards, because most people buy them. Now that AMD is becoming more popular, that might change.
What you say will happen regardless, but for another reason: Nvidia themselves will pump out 16GB VRAM 70- and 80-series cards, and then developers will be free not to care about 8-10GB VRAM cards, and performance will suffer.
 
Well, if people conclude the 80 doesn't have enough, then the 70 definitely doesn't :p.

I begrudge moving from an old GTX 1080 to a 3070 and not getting any more memory.

:D

I'm not going to reply any further, especially to you, because you're really being dishonest at this point.

Case is closed for me. Have a good day.

Can't disprove the simple fact that a better CPU paired with your 3070 would give you a better experience, so throw the hat in :cry:


Have you got a 3070, and have you tried this game? It looks pretty horrible to me in the 1440p part.


Nope, I don't, as it's not a card capable of maxed-out gaming going forward at 1440p, let alone 4K, because of the lack of VRAM AND less grunt than the 3080... hence why I didn't buy one.

Are you also disputing the fact that pairing a 3070 with a newer/better CPU would also give a better experience in this than running a 3070 with a 2700X? :D
 