Is 8GB of VRAM enough for the 3070?

Permabanned · Joined: 7 Oct 2018 · Posts: 2,170 · Location: Behind Pluto
I have an RTX 3070 and I am gaming at 1440p ultrawide (3440x1440). It's so sad: the card is practically performing at 2080/2080 Ti levels but is being held back by the VRAM, 100%, on this title at least. It's depressing to look at. Future proof my ass. I bought it specifically for the Cyberpunk launch as it was the only RTX card I could find for a not-over-the-top price in my area. Being new to RTX (I upgraded from a GTX 1080), of course I started having weird lag spikes and began to monitor everything. The major lag spikes started happening when the game reached 7.9GB of VRAM: GPU usage spikes down quickly, resulting in a stutter, and then goes back up, which I assume is because of the VRAM.
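
For reference, here is a minimal sketch of the kind of per-second logging I mean, using NVIDIA's NVML Python bindings (pynvml, via `pip install nvidia-ml-py`; the tool choice is an assumption, since any monitor such as Afterburner exposes the same counters):

```python
# Log VRAM use and GPU utilisation once per second, so stutters can be
# lined up against the moment usage approaches the 8GB ceiling.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)         # bytes
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # percent
        print(f"VRAM {mem.used / 2**30:.2f}/{mem.total / 2**30:.2f} GB | "
              f"GPU {util.gpu}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```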

So no, even if Cyberpunk is the new "Crysis" of our time, the 3070 isn't a 1440p card and I'm surprised it is being marketed as such... It's a 1080p card if you want to run the higher settings. I'd say for 1080p the 8GB will be more than enough. Honestly, I am not sure what I expected. I guess I should have heeded all the "doomsayers" when the cards were announced and people were yelling about the VRAM on the 3070/3080. This really is the first time I am disappointed in Nvidia; it is a major disappointment being able to use the DLSS and RT options only to have the game tell me **** you due to the VRAM. FYI, if I turn off the RTX options altogether, the 8GB is enough.

Whoever is saying "MUH silly 8GB elitism" just doesn't know what they're talking about. No card should have to struggle with its own VRAM (not one released in 2020, anyway). I understand that I may have set the settings high up there, but the FPS is fine (I come from playing FPS titles like CS:GO and Siege, so I am quite elitist about frames, but I fine-tuned the settings until I got an acceptable overall framerate for the action parts of the game).

I saw the 3070 being marketed as "good for 1440p", which is the only reason I am mad. I do believe it could be, but not with 8GB of VRAM. That, and I should not have been in the "muh silly 8GB elitism" camp beforehand.

Just to play the devil's advocate here... You could say that without RT features the 8GB is probably enough, even for 1440p. But I would prefer not to play a game of "will there be enough VRAM?" with every upcoming title. Surely I am not being unreasonable in saying that if you want to game at 1440p (or at least 3440x1440, which is 30% wider) you should skip this card, or be prepared to adjust game settings to accommodate its lack of sufficient VRAM.

It's funny how we went from VRAM being over-provisioned (8GB has been the standard since the GTX cards came out) to the opposite.

One of the buggiest, crappiest games to exist, and you judge your purchase over this one game.

To be fair with you, I don't own Cybershite 1997, as it was never a game I thought would live up to its hype, and I was correct. Just as I thought Fallout-but-online was a terrible idea, at least logically; that turned out true, not for logical factors but for just how low-effort it was.

The latest game I own would perhaps be Horizon Zero Dawn, which doesn't eat my VRAM, and neither does any other game.

All the Resident Evil games claim I exceed my 8GB VRAM buffer in the options menu but never actually use it in game, Village especially, even with RT enabled. 2560x1440.


It seems people who enjoy low-effort games make low-effort posts too.
 
Permabanned · Joined: 7 Oct 2018 · Posts: 2,170 · Location: Behind Pluto
Hmmmm. Is the RTX 3090 meant to be 800% faster than the 3070?

Strange.

Maxed-out RTX 3070 GPU usage, yet under 100 watts of board power?

Must be smoking those funny fags again, eh?

Most likely a fully faked video, or they purposely power-limited the RTX 3070.

My RTX 3070 at full tilt pulls 255-265W.
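
For reference, board power can be read the same way as the VRAM numbers above; a minimal sketch using the NVML Python bindings (pynvml again, an assumed tool choice, as GPU-Z or HWiNFO report the same sensor):

```python
# One-shot board power reading via NVML. nvmlDeviceGetPowerUsage
# reports milliwatts, so a 3070 at full tilt should print a figure
# in the 220-270W region, not under 100W as in that video.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000
limit = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000
print(f"Board power: {watts:.0f}W (enforced limit: {limit:.0f}W)")
pynvml.nvmlShutdown()
```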
 
Permabanned · Joined: 7 Oct 2018 · Posts: 2,170 · Location: Behind Pluto
Here is my latest purchase... let's see if it runs out of VRAM!

I can update with every game I play maxed out, and we can play the numbers game if you all want.

[two screenshots attached]
 
Permabanned · Joined: 7 Oct 2018 · Posts: 2,170 · Location: Behind Pluto
RE Village, almost maxed out... I like some settings lowered, as I prefer higher FPS over useless FPS killers; the stuff that matters is maxed out though, and we are way above the VRAM suggestions, LOL.

Of note: when I stopped recording, VRAM usage dropped sharply to under 5GB (4,875MB). I was using NVENC through OBS.
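
If anyone wants to check how much of that is the encoder session itself, NVML also exposes NVENC utilisation; a minimal sketch (pynvml again, an assumed tool choice):

```python
# Watch NVENC load and VRAM together while OBS records: when the
# recording stops, encoder utilisation should fall to 0% and the
# VRAM held by the capture/encode session should be released.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        enc_util, _period_us = pynvml.nvmlDeviceGetEncoderUtilization(handle)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"NVENC {enc_util}% | VRAM {mem.used / 2**20:.0f} MB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```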

 
Man of Honour · Joined: 13 Oct 2006 · Posts: 91,166
One of the buggiest, crappiest games to exist, and you judge your purchase over this one game.

To be fair with you, I don't own Cybershite 1997, as it was never a game I thought would live up to its hype, and I was correct. Just as I thought Fallout-but-online was a terrible idea, at least logically; that turned out true, not for logical factors but for just how low-effort it was.

The latest game I own would perhaps be Horizon Zero Dawn, which doesn't eat my VRAM, and neither does any other game.

All the Resident Evil games claim I exceed my 8GB VRAM buffer in the options menu but never actually use it in game, Village especially, even with RT enabled. 2560x1440.


It seems people who enjoy low-effort games make low-effort posts too.

Old post you're replying to there, but I think SnakeHelah was encountering a not-uncommon bug with the game. With my 3070 I could play for around 2 hours at 1440p, all settings ultra including ray tracing, with DLSS quality mode at ~60 FPS and upper-6GB VRAM used; then it would degrade quite quickly, with VRAM shooting up to 7.3+GB and FPS dropping to ~45, until I saved and restarted, at which point it would go back to ~60 FPS for about another 2 hours. For some reason it would run poorly like that straight away for some people.

https://forums.overclockers.co.uk/posts/34382802
 
Permabanned · Joined: 7 Oct 2018 · Posts: 2,170 · Location: Behind Pluto
Horizon textures at dawn.

Dem textures not be dawning on me as lower res bruh.
Probs because your 8GB AMD GEEPEEYUU was the issue, not its VRAM, as is typical of AMD hardware. Shiet drivers.

 
Permabanned · Joined: 7 Oct 2018 · Posts: 2,170 · Location: Behind Pluto
Oh, and just as the final nail.

[attached image]

Even Nvidia know 8GB isn't enough for 4K.
The GPU itself is not enough for 4K, let alone the amount of VRAM it has... I don't even think Nvidia aimed a generally lower high-end GPU at that resolution, now did they?
Then again, they probably overshot their own intelligence level, thinking commoners would have common sense.

Almost like the VRAM fits its best use case: 2560x1440 gaming.

Dun dun dun!!!!!!!
 
Permabanned · Joined: 7 Oct 2018 · Posts: 2,170 · Location: Behind Pluto
You're going to run right into GPU limitations at 4K.

The 3070 FE benchmarks out today (https://youtu.be/NbZDERlshbQ) show it running about 63 FPS average in RDR at 4K/High, 33 FPS average in Total War: Three Kingdoms at 4K/Ultra, and 65 FPS in Shadow of the Tomb Raider at 4K/High. Many of the games in this review were running at High presets, not Ultra, so if what you're looking to do is max out Ultra performance at 4K, then a 3070 is going to be borderline in many games out right now. And those games are not bottlenecked by VRAM, or even really anywhere close to it.

The games out in a few years' time may creep closer to the 8GB VRAM limit, but they're also going to hammer the GPU to death. You aren't going to benefit from more VRAM if the GPU can only play the game at something like 20 FPS.
Smart man.
 
Permabanned · Joined: 7 Oct 2018 · Posts: 2,170 · Location: Behind Pluto
I don't care what it's doing. All I care about is not buying a £500+ GPU for my £1000 PC (rough figures for demonstration purposes) only to have it tank and run games slowly. I don't care about all of the pedantic, boring stuff; I just don't want certain games tanking or becoming unplayable.

As such? He or she could have a degree in astrophysics; I really couldn't give a toss. Like I said, if I know I have to start cutting settings, I will just buy a console. They are much cheaper.

TBH? I don't even really care about the cut in quality on consoles. I wouldn't; I own three (two XB1X, one PS Pro) and I really enjoy using them. That said, however, I did not spend £3000+ on them. Had I done that on a rig and then found myself compromising? Yeah, no thanks.

At least AMD seem to have this covered.

One of the thickest members on here, to be fair. Have you measured yourself against a plank of wood yet?

Yes, you don't care, other than to hear your own stupid mouth talk.


Why is it we have to put up with low-intellect people? Never the other way around, is it...?
It's all about you!
 
Permabanned · Joined: 7 Oct 2018 · Posts: 2,170 · Location: Behind Pluto
It's enough for now, but I seriously doubt it'll be enough in two years' time. Nvidia have really stiffed us on the VRAM this time.

I think they realised a lot of people skipped Turing. They know that if they skimp on VRAM on the 30-series, then when the next gen comes, more people will buy it if it has extra VRAM. It's a bit of a con, really.
It's 2022.

Where are you?
 
Permabanned · Joined: 7 Oct 2018 · Posts: 2,170 · Location: Behind Pluto
I had about four posts I was going to reply to with this, but yours was the latest :D

What irks me the most about the 3070 is that it is fast. Really fast. For £500? It is a pretty epic bargain. However, it should be able to do 4K. Times have changed and we have moved on. At one point the Fury X was a 4K card. At one point the 980 Ti and 1080 Ti were 4K cards. They didn't cost much more than the 3070, and it is now, what, four or five years later?

It seems that the only thing stopping it being a true 4K card is the memory. Note I mean that broadly; i.e., it could well be a combination of not enough capacity and not enough bandwidth. That's a shame, because it means at 4K you are better off buying a used 2080 Ti. I would like to have seen a scenario where that wasn't a good idea.

That is why I am most annoyed and disappointed. Everything on the card itself is more than capable of 4K, yet they have derped it by not giving it enough VRAM for now, let alone the future. Had they upped it? Then the bandwidth would have gone up also, and it would have lasted you two to three times as long. Of course I don't expect that to come for free; I would have been happy to see it cost £100 more. However, if it is to be taken at face value (i.e. as a 1440p card), then like I have said, it's quite expensive. The 2080 Super will do just as well at 1440p for gaming (obviously the FPS will be lower, but the experience would be more than good enough).

As I said though, the most annoying thing is that this would be a perfectly capable 4K card had they not derped it. Which is a shame.


I do love me some ALXAndy...

So you expect yesterday's GPU performance to carry you at 4K to infinity and beyond?

Yet you were calling the VRAM yesterday's VRAM...?

This is my last time responding to you.

That plank of wood might have you beat. :eek:
 
Soldato · Joined: 19 Jan 2010 · Posts: 4,806
So how come Warzone will use all 24GB of my 3090's VRAM right up until the game crashes? I have limited it to 0.55 in the advanced settings so that the game (only) uses 17GB...
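
For anyone following along, that "advanced settings" knob is presumably the VideoMemoryScale value in the game's adv_options.ini; a minimal sketch of editing it (the file path and key name are assumptions on my part, so back the file up first):

```python
# Cap the fraction of total VRAM the game budgets for itself by
# rewriting VideoMemoryScale in adv_options.ini (hypothetical path).
from pathlib import Path

CONFIG = (Path.home() / "Documents" / "Call of Duty Modern Warfare"
          / "players" / "adv_options.ini")

def set_video_memory_scale(scale: float) -> None:
    """Replace the VideoMemoryScale line, leaving everything else intact."""
    lines = CONFIG.read_text().splitlines()
    for i, line in enumerate(lines):
        if line.lstrip().startswith("VideoMemoryScale"):
            lines[i] = f"VideoMemoryScale = {scale}"
    CONFIG.write_text("\n".join(lines) + "\n")

set_video_memory_scale(0.55)  # the value quoted in the post above
```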
 
Permabanned · Joined: 7 Oct 2018 · Posts: 2,170 · Location: Behind Pluto
So how come Warzone will use all 24GB of my 3090's VRAM right up until the game crashes? I have limited it to 0.55 in the advanced settings so that the game (only) uses 17GB...
Because Warzone loves you.
I don't play it, in fairness, and I am not downloading it either; it's a waste of HDD space and SSD writes.

From what I can gather, though, most people are completely fine.


 