
Is 8GB of VRAM enough for the 3070?

I use a 3070 currently because I couldn't get a 3090. Yes, I can't work with my usual 8K workflow anymore without proxies; however, it's absolutely fine for 4K 120 fps H.265 and raw footage. I'm sure it'll be okay for his Unreal Engine work, and if anything it should offload to system RAM as well, not that I've used Unreal Engine much... I mean, people do run Unreal Engine on outdated cards, and the Ampere architecture is superior to Turing for workstation use. If anything, a faster CPU would load up Unreal Engine faster and reduce times too.

Excellent, thank you. It seems at 1440p he'll be fine with a 3070 FE then, and can upgrade the CPU (and RAM) to a 5900X/5950X later down the road.
 
The 3080 appears to be quite a lot faster (40% or so at 1440p) in Unreal Engine; however, if you don't want to spend the extra £200 or whatever it is, the 3070 seems to be fine for it.
 

Yeah, it's quite a big gap. We'll see what Santa says regarding a 3080 ;):D
 

https://youtu.be/XhddaznExCA

Godfall, 4K Epic settings (maxed out):

6-8GB allocation, and only 5.5-6.5GB actual usage.

Sooooooo, 12GB my ass lmaaao
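
If you want to check the allocation-vs-usage gap yourself, here is a minimal sketch, assuming the nvidia-ml-py (pynvml) package and GPU index 0 (neither is mentioned above). The device-level figure NVML reports is allocated memory; the lower "actual usage" numbers quoted above typically come from per-process counters like the ones printed below.

Code:
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the game runs on GPU 0
try:
    for _ in range(10):  # ten one-second samples
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"device: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB allocated")
        for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
            if proc.usedGpuMemory is not None:  # driver may hide per-process figures
                print(f"  pid {proc.pid}: {proc.usedGpuMemory / 2**30:.2f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()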
 

If I'd been able to get one I'd have taken the 3080 over the 3070; the value seems to be in the 3080's favour. Will have to grab a 3080 Ti next year.
 
What? Nobody is using a 3070 for 4K Cyberpunk 2077; I only manage 55 fps on a 3090 with RTX off... From what do you draw your conclusion that 8GB isn't enough? Apparently nothing is enough for Cyberpunk 2077 at 4K.

The fact that even at 1440p the 2080 Ti is beating it comfortably, because the 3070 is streaming assets when it runs short on VRAM. You know, the last 24 pages' worth of arguing.

I have been waiting for one of these tech channels to show the 2080 Ti. I mean, I know it's "OLD TECH INNIT END OF LIFE MATE", but I was at least expecting to see it put up against the 3070 and 3080 in this game. It seems the only channel to do it is HWUB, and they are the channel that just got shunned by Nvidia for not sitting in a corner obsessing over nothing but RT and DLSS performance. I guess Nvidia "advised" (or rather, bullied) the other tech channels into not including the 2080 Ti for these reasons. Who in their right mind would buy a 3070 with 8GB of VRAM when you can buy a used 2080 Ti for exactly the same price or cheaper?
 
I own a 3070 and have not run into VRAM issues at 4K with any other AAA title I own.
Cyberpunk 2077 is a buggy mess, so calling on ONE unoptimized game to claim a 3070 with 8GB is not enough, when not even a 3090 can run the game well with all the bells and whistles, is trash talking...

Was this your argument when the original Crysis was first released, with not even an 8800 GTX able to run it maxed out?...

You know, turning down certain settings that don't impact visual clarity can make the game playable, rather than turning on the "break the game engine" graphics settings...

I rest my case. Back to gaming.
 

Buy a 2080 Ti for £1149 instead of a 3070 for £469 then.

Your call.

Meanwhile, I only get 22 fps in Cyberpunk 2077 at 4K RTX Ultra on a 3090, so even if a game requires more than 8GB of VRAM, you probably won't be able to run it properly anyway. There are few games that do require more than 8GB though; actually REQUIRE it, not ALLOCATE it.

Silly '8GB NOT ENUF' elitism.
 

Thanks for reading my post. You know, the part where I specifically said you can get a 2080 Ti for the same price as a 3070.
 
Andy, the 2080 Ti you got for that price was about a month before the release of the 3080. With all the uncertainty over whether the 3070 was really going to be as quick as a 2080 Ti for £479, many people sold off their 2080 Ti to grab a 3080/3090.
But we all know how that played out, with low stock issues etc. causing scalping and hiked-up prices, which means you were lucky at that time to get one so cheap. Try today and you will have no chance in the second-hand market.

3070s are selling for £600+ on eBay, and 2080 Tis for a fraction more.

So your point is invalid at this point in time.
 
Well, even if 8GB is not enough for 10% of titles at 4K ultra, that means at 1440p (more common) 8GB will be fine for 1-2 years at ultra, and 2-3 years at 1080p ultra. Probably 3-4 years at high/medium textures at 1440p, though.

At 4K, 8GB is not enough for the future. It will run 90% of the games out right now, and is probably fine if you're happy turning textures down a notch in 1-2 games out of 10.
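
For a rough sense of how much resolution alone moves the VRAM needle, here is a back-of-envelope sketch; the render-target layout and byte counts are illustrative assumptions, not figures from the thread.

Code:
# Back-of-envelope only: raw render-target memory per frame, ignoring textures,
# which usually dominate VRAM and are largely resolution-independent.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
BYTES_PER_PIXEL = 4 * (1 + 1 + 4)  # colour + depth + ~4 G-buffer targets, 4 bytes each (assumed)

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name:>6}: {w * h * BYTES_PER_PIXEL / 2**20:6.0f} MiB of render targets")
# ~48 MiB at 1080p, ~84 MiB at 1440p, ~190 MiB at 4K: a real jump, but small next to
# multi-GB texture pools, which is why dropping texture quality a notch frees far more.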
 
Nobody really buys a 3070 for 4K when you can get a 3080 for £200 more; besides, 4K is a pointless resolution unless you're playing on a 120Hz display.
 
True, 1440p 144Hz over 4K 60Hz any day for me too. However, I am buying by March/April, so I might be able to get a 3070 Ti 10GB (or 16GB??) if that launches by then.
 
I have an RTX 3070 and I am gaming at 1440p ultrawide (3440x1440). It's so sad that the card is practically performing at 2080/2080 Ti levels but is being held back by the VRAM, 100% on this title at least. It's depressing to look at. Future proof my ass; I bought it specifically for the Cyberpunk launch as it was the only RTX card I could find for a not-over-the-top price in my area. Being new to RTX, as I upgraded from a GTX 1080, of course I started having weird lag spikes and began to monitor everything. The major lag spikes started happening when the game reached 7.9GB of VRAM. The GPU usage then spikes down quickly, resulting in a stutter, and then goes back up, which I assume is because of the VRAM.

So, no, even if Cyberpunk is the new "Crysis" of our time, the 3070 isn't a 1440p card and I'm surprised it is being marketed as such... It's a 1080p card if you want to run the higher settings. I'd say for 1080p the 8GB will be more than enough. Honestly, I am not sure what I expected. I guess I should have heeded all the "doomsayers" when the cards were announced and people were yelling about the VRAM on the 3070/3080 cards. This really is the first time I am disappointed in Nvidia; it is a major disappointment to be able to use DLSS and RT options only to have the game tell me **** you due to the VRAM. FYI, if I turn off the RTX options altogether, the 8GB is enough.

Whoever is saying "MUH silly 8GB elitism" just doesn't know what they're talking about. No card should have to struggle with its own VRAM (not one released in 2020, anyway). I understand that I may have set the settings high, but the FPS is fine (I come from playing FPS games like CS:GO and Siege, so I am quite elitist about frames, but I fine-tuned the settings until I got an acceptable overall framerate for the action parts of the game).

I saw the 3070 being marketed as "good for 1440p", which is the only reason I am mad. I do believe it could be, but not with 8GB of VRAM. That, and I should not have been in the "muh silly 8GB elitism" category beforehand.

Just to play the devil's advocate here... You could say that without RT features the 8GB is probably enough even for 1440p. But I would prefer not to play a game of "will it be enough VRAM?" with every upcoming title. Surely I am not being unreasonable in saying that if you want to game at 1440p (or at least 3440x1440, which is about 34% wider), you should skip this card, or be prepared to adjust game settings to accommodate its lack of sufficient VRAM.

It's funny how we went from VRAM being over-provisioned (8GB has been the standard since the GTX cards came out) to the opposite.
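
The pattern described above (VRAM sitting near the 8GB ceiling while GPU usage dips) is straightforward to log. Here is a minimal sketch, again assuming pynvml and GPU index 0; the 7.5GB and 40% thresholds are arbitrary illustrations, not values from the post.

Code:
import time
import pynvml  # pip install nvidia-ml-py

VRAM_CEILING_GIB = 7.5  # "close to the 8GB limit" - illustrative threshold, not from the post
UTIL_DIP_PERCENT = 40   # "usage spikes down" - illustrative threshold, not from the post

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        used_gib = mem.used / 2**30
        flag = ""
        if used_gib >= VRAM_CEILING_GIB and util.gpu <= UTIL_DIP_PERCENT:
            flag = "  <-- VRAM near ceiling while GPU load dips (possible stutter)"
        print(f"{time.strftime('%H:%M:%S')}  vram={used_gib:4.1f} GiB  gpu={util.gpu:3d}%{flag}")
        time.sleep(0.5)
finally:
    pynvml.nvmlShutdown()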
 